Crossgate Canonical Structure

Hello Experts,
Could you please clarify a query regarding the Crossgate canonical XML structure?
As per the documentation available on the internet:
"Crossgate has standardized their processes into one canonical form for each transaction type"
We are currently dealing with EDIFACT messages and have a canonical XML structure for each message type, e.g. for ORDERS.
If we switch to another EDI standard, say ANSI X12, will the canonical XML structure for ANSI X12 ORDERS remain the same, or are there different canonical XML structures for ANSI X12?
Basically, what changes would be needed in the mapping for an existing customer when switching from EDIFACT to ANSI X12?
Thanks a lot!
Best Regards,
Shweta

Hello Raja,
Thanks for the reply. I think the same, because the qualifiers used in the mapping instructions for EDIFACT ORDERS seem to be specific to EDIFACT syntax.
But then what is the advantage of having a canonical XML that changes for each EDI standard and hence forces the PI mapping to be changed?
Is this fact documented somewhere?
Thanks and best regards,
Shweta

Similar Messages

  • XSLT Transformation error:  Non-canonical structure of element name

    Good day experts,
    I have recently started using XSLT and came upon the following demo on SDN.
    http://wiki.sdn.sap.com/wiki/display/ABAP/XMLXSLTwith+ABAP
    I have retrieved the example xml files from airplus.com, as per the instructions, and implemented the code.
    When I test the xslt transformation in se80, it transforms correctly.
    However, when I run the program, I get the following error.
    CX_XSLT_FORMAT_ERROR
    Transformation error:  Non-canonical structure of element name XML_OUTPUT   
    Is there an error in the example that I am not aware of?
    Thanks in advance,
    Johan Kriek

    Found the solution.
    Rename the tag <XML_OUTPUT> to anything else, like <TEST>, and hurray, it works!
    It looks like SAP is using this name internally somewhere, so we get the error when we use the same name.
    Anyway, the problem is solved.
    Regards,
    Jai
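    For reference: when CALL TRANSFORMATION binds an ABAP result with RESULT my_result = itab, the XSLT is expected to produce the ABAP asXML envelope, with the binding name (conventionally in upper case) as the element under <asx:values>. A minimal sketch of such a stylesheet, where MY_RESULT, SourceRoot/Row and FIELD1 are placeholders rather than names from the original demo, could look like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch only: MY_RESULT, SourceRoot/Row and FIELD1 are placeholders. -->
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                    xmlns:asx="http://www.sap.com/abapxml">
      <xsl:template match="/">
        <asx:abap version="1.0">
          <asx:values>
            <!-- element name = RESULT binding name from CALL TRANSFORMATION -->
            <MY_RESULT>
              <!-- one element per line of the bound internal table -->
              <xsl:for-each select="/SourceRoot/Row">
                <item>
                  <FIELD1><xsl:value-of select="Field1"/></FIELD1>
                </item>
              </xsl:for-each>
            </MY_RESULT>
          </asx:values>
        </asx:abap>
      </xsl:template>
    </xsl:stylesheet>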

  • Canonical structure mapping

    Hi Experts,
    I need some inputs on mapping from a MATMAS05 IDoc to an OAGIS XSD. Has anyone worked on a similar case? If so, please give me a step-by-step approach; it would be very helpful for me.
    Is there a way to modify the OAGIS XSD without disturbing its references?
    Please provide me some inputs.
    ACS

    Hi
    This is mapping as usual. You can even check with IBM, where you can get a predefined mapping XSL. If it is useful and satisfies your requirement, use it directly.
    You can edit the XSD using any XSD editor, like EditPlus etc. But what is the requirement for changing the OAGIS XSD? It is kept very generic to satisfy the business process.
    You can even create an XSD in the XI editor according to the requirement.
    My suggestion: if you don't use most of the fields defined by OAGIS, it is worth not using it at all; importing such high-volume XSDs creates extra effort during mapping on the system.
    Thanks
    Gaurav
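    Purely as an illustration of the kind of field-level mapping being discussed: a hand-written XSLT fragment that reads the material number from the MATMAS05 IDoc XML and writes it into an OAGIS-style element. The target element names here are placeholders; the real names and namespaces must come from the imported OAGIS XSD.
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch only: SyncItemMaster/ItemMaster/ItemID are placeholder target names. -->
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <SyncItemMaster>
          <!-- one target item per material master segment of the IDoc -->
          <xsl:for-each select="/MATMAS05/IDOC/E1MARAM">
            <ItemMaster>
              <ItemID>
                <xsl:value-of select="MATNR"/>
              </ItemID>
            </ItemMaster>
          </xsl:for-each>
        </SyncItemMaster>
      </xsl:template>
    </xsl:stylesheet>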

  • Log & Transfer of Canon HF 11: error message

    Hi guys,
    I just got a new Canon HF 11 (basically the same as the HF 10) and I tried to import footage using Log & Transfer, but every time I click "add clip to queue" it spins a grey icon for a few seconds, which then turns into a red circle with an exclamation mark inside and says "Error: No Data". I have tried giving the clip an in and out mark, but this doesn't help.
    Any ideas,
    Thanks

    I have had a similar problem with my Canon HG21 which has both hard disk and 32GB flash drive. My method is to copy files from the Hard Disk to the flash drive and then place the flash drive into a card reader. I was able to see FCE downloading in log and transfer for a second or two and then it went into an idle mode and never downloaded the file.
    My solution was to delete ALL files from the flash drive, followed by copying the videos and stills from the hard disk onto the clean flash drive. I realize you don't have the luxury of the hard disk, but perhaps you can extract the files using the Voltiac tool to clone the Canon structure from the camera and see if Log and Transfer will work directly from your hard disk on the Mac.

  • Where will PI go?

    The SAP Information Interchange is SAP's new EDI and B2B managed service. It is a managed EDI/B2B exchange service that is operated in a cloud computing environment. It utilizes SAP's NetWeaver platform as a foundation for the service which makes integrating with SAP users that have NetWeaver very simple - almost plug and play. Even those without NetWeaver can activate the service using IDocs or tRFC.
    /people/kevin.benedict/blog/2010/03/01/sapreg-information-interchange--saps-new-edi-solution
    http://www.crossgate.com/solutions-amp-services/sap-information-interchange-by-crossgate/
    http://www.gartner.com/DisplayDocument?id=1311333
    Does anyone know the relationship between PI & Information Interchange?
    Will SAP Information Interchange replace PI, or is it only an enhancement for PI?
    Where will PI go?
    Regards
    Lawrensh

    I think the answer to this was provided in the blog given as reference.
    Custom mappings?
    2010-03-03 02:51:23 Christian Sy
    Interesting. But how do you handle the requirement of customer-specific mappings ? I have not seen one single customer which did not require any specific mapping changes between Idoc and EDIFact (so far about EDI "standard"). Is it possible that the customer does the mapping in his own PI and then just calls a SAP EDI service which converts the document 1:1 between XML and EDIFact ?
    And what about monitoring options ? Famous topic, we sent the document, but customer did not receive it. What now ? Where to look ?
    Regards,
    CSY
    Custom mappings ?
    2010-03-03 15:16:18 Kevin Benedict
    Good question CSY! SII (SAP Information Interchange) is a managed EDI Hub, and as such they will need to map the customer specific requirements into the Hub. However, once the map is created it will be available for re-use by all other SAP customers that exchange EDI with that particular customer. That is my understanding.
    Interoperability with PI
    2010-03-02 02:33:43 Kenneth Eriksen
    Hi,
    Interesting news! Do you know how this will operate in relation to SAP PI? Will PI be necessary for Information Interchange to function or will Information Interchange reduce the need for companies to invest in PI as their integration platform?
    Thanks for any feedback!
    Br,
    Kenneth
    Interoperability with PI
    2010-03-02 09:14:53 Kevin Benedict
    The SAP Information Interchange uses NetWeaver PI itself, in the EDI Hub, and can easily connect PI-to-PI with any other SAP customer using it. However, the EDI Hub can also connect to any SAP customer via tRFC, web service or IDoc as well.
    Interoperability with PI
    2010-03-04 06:14:16 Steve Sprague
    The service can connect to an SAP application in two ways today. The standard certified processes are based on standardized IDOCs via RFC connections. You can also utilize SAP PI as a middleware and exchange XML IDOCs as well as Enterprise Services in future releases.
    From an implementation perspective - there is a managed service/hosted component that Crossgate provides that manages all the day to day operations, transactional monitoring, partner connectivity, as well as add on components such as e-Invoicing support in 38 countries.
    To other comments, here is a more defined definition:
    The SAP Information Interchange is a standalone runtime system for managing B2B mapping based on the message content. SAP Information Interchange runs in the role of a B2B/EDI translator (mapping engine). SAP Information Interchange integrates into the backend system (e.g. SAP ERP) and translates the backend system (standard IDOC formats, PI connectivity, or via Enterprise Services) to the external business partners' message format (i.e. Wal-Mart Purchase Order EDI format). The major difference is that you are no longer purchasing an EDI mapping tool and building the content. Instead, the runtime is included and pricing is based on a set of profiles representing the individual rules and requirements of the Trading Partner.
    This is the major difference from any competitor: content. You are actually purchasing the Wal-Mart Purchase Order to ORDERS05 IDOC mapping, which is running and working. This content is then maintained by SAP. For example, if Wal-Mart were to add an additional field or change their version, as they tend to do, that mapping format change is provided within the support contract.
    The core of the mapping methodology is based on a canonical structure, similar to the classic way EAI solutions are implemented. The left side (or ERP interface) is maintained by SAP based on the GDT definitions and Enterprise Service descriptions. The right side (Partner Profiles) is always mapped to the canonical format. Interestingly enough, this mapping procedure has been done for years. Many of the large global enterprises have used canonical structures between their apps and B2B solutions in order to provide a buffer from internal changes as well as flexibility to consolidate systems.
    Personally, I think that PI will remain here. It is not just restricted to EDI.

  • XML to Internal table using XSLT by CALL TRANSFORMATION error

    Dear experts,
    I have to fetch the data from an XML file into internal tables using XSLT. The XML file is very big, as follows:
    <?xml version="1.0" standalone="yes" ?>
    <Shipment>
      <shipmentID>25091203S000778</shipmentID>
      <manifestDateTime>2009-12-03T20:16:52.00</manifestDateTime>
      <shipmentFacilityNumber>025</shipmentFacilityNumber>
      <shipmentFacilityAbbreviation>CHI</shipmentFacilityAbbreviation>
      <shipmentFacilityAddress1>810 KIMBERLY DRIVE</shipmentFacilityAddress1>
      <shipmentFacilityAddress2 />
      <shipmentFacilityCity>CAROL STREAM</shipmentFacilityCity>
      <shipmentFacilityState>IL</shipmentFacilityState>
      <shipmentFacilityPostalCode>601880000</shipmentFacilityPostalCode>
      <shipmentTruckCarrierCode>X150</shipmentTruckCarrierCode>
      <shipmentSourceCode>T</shipmentSourceCode>
      <userID>CAMPOSG</userID>
    <Delivery>
      <primaryCustomerNumber>954371</primaryCustomerNumber>
      <primaryCustomerName>MIDWEST OFFICE SUPPLY</primaryCustomerName>
      <primaryCustomerAddress1 />
      <primaryCustomerAddress2>4765 INDUSTRIAL DR</primaryCustomerAddress2>
      <primaryCustomerCity>SPRINGFIELD</primaryCustomerCity>
      <primaryCustomerState>IL</primaryCustomerState>
      <primaryCustomerPostalCode>627030000</primaryCustomerPostalCode>
      <primaryCustomerPhoneNumber>2177535555</primaryCustomerPhoneNumber>
      <shuttleStopFacilityNumber />
      <billOfLadingNumber>25HZK99</billOfLadingNumber>
      <carrierProNumber />
      <shipmentTotalCartonCount>6</shipmentTotalCartonCount>
      <shipmentTotalWeight>266</shipmentTotalWeight>
    <order>
      <orderNumber>25HZK99</orderNumber>
      <subOrderNumber />
      <dateProcessed>2009-12-03</dateProcessed>
      <primaryOrderNumber />
      <shipTruckCode>X150</shipTruckCode>
      <shipTruckDescription>UDS - ADDISON</shipTruckDescription>
      <shipTruckPriorityCode>01</shipTruckPriorityCode>
      <shipTruckGroupCode>01</shipTruckGroupCode>
      <shipTruckDepartureTime>20.00.00</shipTruckDepartureTime>
      <shipTruckDockID>07</shipTruckDockID>
      <ldpFacilityAbbreviation />
      <shuttleAvailableIndicator>N</shuttleAvailableIndicator>
      <shuttleMessageText />
      <crossDockFacilityCode />
      <crossDockTruckCode />
      <crossDockID />
      <subsidizedFreightTruckID />
      <customerPurchaseOrderNumber>623559</customerPurchaseOrderNumber>
      <headerTypeCode>P</headerTypeCode>
      <orderTypeID>RG</orderTypeID>
      <deliveryTypeID>DS</deliveryTypeID>
      <deliveryMethodCode />
      <customerBarCode />
      <customerReferenceData>25HZK99</customerReferenceData>
      <customerReferenceText />
      <customerRouteData>ZNED UNTED</customerRouteData>
      <customerRouteText>ROUTE</customerRouteText>
      <endConsumerPurchaseOrderNumber />
      <endConsumerPurchaseOrderText />
      <endConsumerName>CHARLESTON TRANS. FACILITY</endConsumerName>
      <endConsumerAddress1>HOMEWOOD DT PROGRAM DEPT. 3</endConsumerAddress1>
      <endConsumerAddress2>17341 PALMER BLVD.</endConsumerAddress2>
      <endConsumerAddress3 />
      <endConsumerCity>HOMEWOOD</endConsumerCity>
      <endConsumerState>IL</endConsumerState>
      <endConsumerPostalCode>60430</endConsumerPostalCode>
      <endConsumerCountryCode />
      <fillFacilityNumber>025</fillFacilityNumber>
      <shpFacilityNumber>025</shpFacilityNumber>
      <homeFacilityAbbrCode>STL</homeFacilityAbbrCode>
      <homeFacilityNumber>015</homeFacilityNumber>
      <multiCartonIndicator>Y</multiCartonIndicator>
      <primaryCustomerIndicator>Y</primaryCustomerIndicator>
      <shipToCustomerNumber>954371001</shipToCustomerNumber>
      <customerCompanyID>01</customerCompanyID>
      <customerTruckID>U888</customerTruckID>
      <customerTruckDescription>UDS - ADDISON</customerTruckDescription>
      <customerTruckDockID>13</customerTruckDockID>
      <thirdPartyBillCarrier />
      <thirdPartyBillID />
      <thirdPartyBillType />
      <qualityCheckIndicator>N</qualityCheckIndicator>
      <warehouseLaydownID />
      <packListPosition>I</packListPosition>
      <preferredPackingType>CTN</preferredPackingType>
      <preferredPackingMaterial>PAPER</preferredPackingMaterial>
      <preferedPackingInstructions />
      <totalOrderCartonQty>6</totalOrderCartonQty>
      <convertAddressIndicator>N</convertAddressIndicator>
      <dealerInstructionIndicator>Y</dealerInstructionIndicator>
      <dealerinstructions1>CPO#: 623559</dealerinstructions1>
      <dealerinstructions2>ATTN: DANA GRIFFIN</dealerinstructions2>
      <dealerinstructions3>INFO: 612</dealerinstructions3>
      <dealerinstructions4>ROUTE: ZNED UNTED</dealerinstructions4>
      <dealerinstructions5 />
      <dealerinstructions6 />
      <shippingInstructionsIndicator>N</shippingInstructionsIndicator>
      <shippingInstructions1 />
      <shippingInstructions2 />
      <shippingInstructions3 />
      <shippingInstructions4 />
      <shippingInstructions5 />
      <shippingInstructions6 />
      <specialInstructionsIndicator>N</specialInstructionsIndicator>
      <specialInstructions1 />
      <specialInstructions2 />
      <customeContainerDesc />
    <carton>
      <deliveryCartonID>253370905995</deliveryCartonID>
      <shipIndicator>Y</shipIndicator>
      <deliveryPalletID>X150</deliveryPalletID>
      <consolidatedDeliveryCartonID />
      <scanDateTime>2009-12-03T19:36:12.00</scanDateTime>
      <cartonWeight>52</cartonWeight>
      <dropShipFlag>1</dropShipFlag>
      <carrierTrackingNumber />
      <carrierZoneID>0</carrierZoneID>
      <codAmount />
      <customerPackageAmount />
      <declaredValue />
      <residentialDeliveryIndicator />
      <serviceTypeCode>00</serviceTypeCode>
      <ssccCode>006860244400829393</ssccCode>
    <Item>
      <shipPrefix>UNV</shipPrefix>
      <shipStockNumber>21200</shipStockNumber>
      <itemDescription>PAPER XERO/DUP WE LTR 20#</itemDescription>
      <orderQuantity>1</orderQuantity>
      <originalShipQuantity>1</originalShipQuantity>
      <shipQuantity>1</shipQuantity>
      <inventoryUnitCode>CT</inventoryUnitCode>
      <inventoryWeightQuantity>52.000</inventoryWeightQuantity>
      <upcNumber>00000000000000</upcNumber>
      <upcRetailCode>087547212004</upcRetailCode>
      <hazmatIndicator>N</hazmatIndicator>
      <serialRequiredIndicator>N</serialRequiredIndicator>
      <dealerMemoPO>S</dealerMemoPO>
      <cartonLineNumber>1</cartonLineNumber>
      <orderLineNumber>11</orderLineNumber>
      <originalOrderPrefix>UNV</originalOrderPrefix>
      <originalOrderStockNumber>21200</originalOrderStockNumber>
      <reasonCode />
    <Item_Serial>
      <serialNumber />
      </Item_Serial>
        </Item>
      </carton>
       </order>
      </Delivery>
      </Shipment>
    This is not the complete XML file, as it exceeds 15,000 characters and I can't post it all here, so I have deleted much of it.
    The hierarchy is as following: Shipment->Delivery->Order->Carton->Item.
    I have created an XSLT for it, which works fine.
    But when I execute my report program, it gives CX_SY_XSLT_FORMAT_ERROR, saying:
    Transformation error:  Non-canonical structure of element name XML_OUTPUT.

    Dear experts,
    My report program is as following:-
    *& Report  Z_ASNTRNS
    REPORT  Z_ASNTRNS.
    *& Report  Z_ASNTRNS
    TYPE-POOLS: abap, ixml.
    TABLES: ZASN_SHIPMENT,ZASN_DELIVERY,ZASN_ORDER,ZASN_CARTON,ZASN_ITEM.
    *CONSTANTS gs_file TYPE string VALUE 'C:Documents and SettingsC5134126DesktopRajesh_kandakatlaSampleASNFile.xml'.
    * This is the structure for the data from the XML file
    TYPES: BEGIN OF ts_item,
          ZSHIPMENT LIKE ZASN_ITEM-ZSHIPMENT,
          VBELN LIKE ZASN_ITEM-VBELN,
          ORDER_NUMBER LIKE ZASN_ITEM-ORDER_NUMBER,
          CARTON_ID LIKE ZASN_ITEM-CARTON_ID,
           ITEM LIKE ZASN_ITEM-ITEM,
           CARTON_LINE_NUM LIKE ZASN_ITEM-CARTON_LINE_NUM,
           CARTON_LINE_NUMBER LIKE ZASN_ITEM-CARTON_LINE_NUM,
          AEDAT(8),
          AEZET(6),
           ITEM_DESCRIPTION LIKE ZASN_ITEM-ITEM_DESCRIPTION,
           ORD_QTY(16),
           ORIGINAL_SHIP(16),
           SHIP_QTY(16),
           UPC_NUMBER LIKE ZASN_ITEM-UPC_NUMBER,
           DEALER_MEMO_PO(5),
           ORDER_LINE_NUM LIKE ZASN_ITEM-ORDER_LINE_NUM,
          STATUS LIKE ZASN_ITEM-STATUS,
           END OF ts_item.
    TYPES: BEGIN OF ts_carton,
          ZSHIPMENT LIKE ZASN_CARTON-ZSHIPMENT,
          VBELN LIKE ZASN_CARTON-VBELN,
          ORDER_NUMBER LIKE ZASN_CARTON-ORDER_NUMBER,
           CARTON_ID LIKE ZASN_CARTON-CARTON_ID,
          AEDAT(8),
          AEZET(6),
           SHIP_INDICATOR LIKE ZASN_CARTON-SHIP_INDICATOR,
           TRACKING_NUMBER LIKE ZASN_CARTON-TRACKING_NUMBER,
           ZZCARTON_WGT(18),
           Item type ts_item,
           END OF ts_carton.
    TYPES: BEGIN OF ts_order,
          ZSHIPMENT LIKE ZASN_ORDER-ZSHIPMENT,
          VBELN LIKE ZASN_ORDER-VBELN,
           ORDER_NUMBER LIKE ZASN_ORDER-ORDER_NUMBER,
          AEDAT(8),
          AEZET(6),
           SUB_ORDER LIKE ZASN_ORDER-SUB_ORDER,
           ORDER_DATE(8),
           PRIMARY_ORDER LIKE ZASN_ORDER-PRIMARY_ORDER,
           CUSTOMER_PO LIKE ZASN_ORDER-CUSTOMER_PO,
           PRIMARY_ID LIKE ZASN_ORDER-PRIMARY_ID,
           SHIP_TO LIKE ZASN_ORDER-SHIP_TO,
          ANZPK(5),
           carton type ts_carton,
           END OF ts_order.
    TYPES: BEGIN OF ts_delivery,
          ZSHIPMENT LIKE ZASN_DELIVERY-ZSHIPMENT,
          VBELN LIKE ZASN_DELIVERY-VBELN,
          AEDAT(8) TYPE C,
          AEZET(6) TYPE C,
           PRIMARY_CUSTOMER LIKE ZASN_DELIVERY-PRIMARY_CUSTOMER,
           BILL_OF_LADING LIKE ZASN_DELIVERY-BILL_OF_LADING,
           CARTON_COUNT(5),
           TOTAL_WEIGHT(18),
           order type ts_order,
           END OF ts_delivery.
    TYPES: BEGIN OF ts_shipment,
           ZSHIPMENT LIKE ZASN_SHIPMENT-ZSHIPMENT,
           MANIFEST_DATE_TIME(25),
          AEDAT(8) TYPE C,
          AEZET(6) TYPE C,
          SDATE(8) TYPE C,
          STIME(6) TYPE C,
           SFACILITY_NUMBER LIKE ZASN_SHIPMENT-SFACILITY_NUMBER,
           ZZCARRIERCODE LIKE ZASN_SHIPMENT-ZZCARRIERCODE,
           Delivery type ts_delivery,
           END OF ts_shipment.
    TYPES: BEGIN OF ts_shipment1,
           ZSHIPMENT LIKE ZASN_SHIPMENT-ZSHIPMENT,
           MANIFEST_DATE_TIME(25),
           SFACILITY_NUMBER LIKE ZASN_SHIPMENT-SFACILITY_NUMBER,
           ZZCARRIERCODE LIKE ZASN_SHIPMENT-ZZCARRIERCODE,
           PRIMARY_CUSTOMER LIKE ZASN_DELIVERY-PRIMARY_CUSTOMER,
           BILL_OF_LADING LIKE ZASN_DELIVERY-BILL_OF_LADING,
           CARTON_COUNT(5),
           TOTAL_WEIGHT(18),
           ORDER_NUMBER LIKE ZASN_ORDER-ORDER_NUMBER,
           SUB_ORDER LIKE ZASN_ORDER-SUB_ORDER,
           ORDER_DATE(8),
           PRIMARY_ORDER LIKE ZASN_ORDER-PRIMARY_ORDER,
           CUSTOMER_PO LIKE ZASN_ORDER-CUSTOMER_PO,
           PRIMARY_ID LIKE ZASN_ORDER-PRIMARY_ID,
           SHIP_TO LIKE ZASN_ORDER-SHIP_TO,
           CARTON_ID LIKE ZASN_CARTON-CARTON_ID,
           SHIP_INDICATOR LIKE ZASN_CARTON-SHIP_INDICATOR,
           TRACKING_NUMBER LIKE ZASN_CARTON-TRACKING_NUMBER,
           ZZCARTON_WGT(18),
           ITEM LIKE ZASN_ITEM-ITEM,
           CARTON_LINE_NUM LIKE ZASN_ITEM-CARTON_LINE_NUM,
           CARTON_LINE_NUMBER LIKE ZASN_ITEM-CARTON_LINE_NUM,
           ITEM_DESCRIPTION LIKE ZASN_ITEM-ITEM_DESCRIPTION,
           ORD_QTY(16),
           ORIGINAL_SHIP(16),
           SHIP_QTY(16),
           UPC_NUMBER LIKE ZASN_ITEM-UPC_NUMBER,
           DEALER_MEMO_PO(5),
           ORDER_LINE_NUM LIKE ZASN_ITEM-ORDER_LINE_NUM,
           END OF ts_shipment1.
    TYPES: BEGIN OF t_xml_line,
                 data(256) TYPE x,
               END OF t_xml_line.
    *Type definition for Airplus
    *READ THE DOCUMENTATION "LASG_XML_INVOICE_BTM"!!!
    * VARs beginning with "a_" are ATTRIBUTES
    DATA: l_ixml            TYPE REF TO if_ixml,
          l_streamfactory   TYPE REF TO if_ixml_stream_factory,
          l_parser          TYPE REF TO if_ixml_parser,
          l_istream         TYPE REF TO if_ixml_istream,
          l_ostream         TYPE REF TO if_ixml_ostream,
          l_document        TYPE REF TO if_ixml_document,
          l_node            TYPE REF TO if_ixml_node,
          l_xml TYPE REF TO cl_xml_document,
          l_xmldata         TYPE string.
    DATA: l_xml_table       TYPE TABLE OF t_xml_line,
             l_xml_line        TYPE t_xml_line,
             l_xml_table_size  TYPE i.
    DATA: l_filename        TYPE string.
    DATA: xml_out TYPE string ,
          size type i.
    DATA: l_xml_x1   TYPE xstring.
    DATA: l_len      TYPE i,
              l_len2     TYPE i,
              l_tab      TYPE tsfixml,
              l_content  TYPE string,
              l_str1     TYPE string,
              c_conv     TYPE REF TO cl_abap_conv_in_ce.
    *      l_itab     TYPE TABLE OF string.
    DATA: BEGIN OF l_itab occurs 0,
          data(256) type c,
          end of l_itab.
    TYPES : BEGIN OF TY_TEXT,
              data(255) type C,
            END OF TY_TEXT.
    DATA: F_XML TYPE STRING.
    DATA : LT_TEXT_OUT type table of TY_TEXT with header line.
    * tables
    DATA: it_shipment    TYPE STANDARD TABLE OF ts_shipment,
          wa_shipment    TYPE  ts_shipment.
    *Errorvariables
    DATA: xslt_err   TYPE REF TO cx_xslt_exception,
          err_string TYPE string.
    PARAMETERS: pa_file TYPE localfile OBLIGATORY
    DEFAULT 'C:Documents and SettingsC5134126DesktopRajesh_kandakatlaSampleASNFile.xml'.
    START-OF-SELECTION.
    * Creating the main iXML factory
        l_ixml = cl_ixml=>create( ).
    * Creating a stream factory
        l_streamfactory = l_ixml->create_stream_factory( ).
        PERFORM get_xml_table CHANGING l_xml_table_size l_xml_table.
    * here we use the CALL TRANSFORMATION method which calls
    * the XSLT program "z_asnfile"
    TRY.
        CALL TRANSFORMATION ('Z_ASNFILE')
          SOURCE xml LT_TEXT_OUT[]
          RESULT xml_output = it_shipment.
    * catch any error, very helpful if the XSLT isn't correct
        CATCH cx_xslt_exception INTO xslt_err.
        err_string = xslt_err->get_text( ).
        WRITE: / 'Transformation error: ', err_string.
        EXIT.
      ENDTRY." setting a breakpoint to watch the workarea
    * by the internal table "it_airplus"
           break-point.
      LOOP AT it_shipment INTO wa_shipment.
      ENDLOOP.
    *&      Form  get_xml_table
      FORM get_xml_table CHANGING l_xml_table_size TYPE i
                                  l_xml_table      TYPE STANDARD TABLE.
        l_filename = pa_file.
    * upload a file from the client's workstation
        CALL METHOD cl_gui_frontend_services=>gui_upload
          EXPORTING
            filename   = l_filename
            filetype   = 'BIN'
          IMPORTING
            filelength = l_xml_table_size
          CHANGING
            data_tab   = l_xml_table
          EXCEPTIONS
            OTHERS     = 19.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                     WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ENDIF.
    * Convert binary to text.
    CALL FUNCTION 'SCMS_BINARY_TO_TEXT'
      EXPORTING
        INPUT_LENGTH          = 70000
    *   FIRST_LINE            = 0
    *   LAST_LINE             = 0
    *   APPEND_TO_TABLE       = ' '
    *   MIMETYPE              = ' '
        WRAP_LINES            = 'X'
    * IMPORTING
    *   OUTPUT_LENGTH         =
      TABLES
        BINARY_TAB            = l_xml_table
        TEXT_TAB              = LT_TEXT_OUT
      EXCEPTIONS
        FAILED                = 1
        OTHERS                = 2.
    IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    ENDFORM.                    "get_xml_table

  • Insert XML file into Relational database model - no XMLTYPE!

    Dear all,
    How can I store a known complex XML file into an existing relational database WITHOUT using xmltypes in the database ?
    I read the article on DBMS_XMLSTORE. DBMS_XMLSTORE indeed partially bridges the gap between XML and RDBMS to a certain extent, namely for simply structured XML (canonical structure) and simple tables.
    However, when the XML structure will become arbitrary and rapidly evolving, surely there must be a way to map XML to a relational model more flexibly.
    We work in a java/Oracle10 environment that receives very large XML documents from an independent data management source. These files comply with an XML schema. That is all we know. Still, all these data must be inserted/updated daily in an existing relational model. Quite an assignment isn't it ?
    The database does and will not contain XMLTYPES, only plain RDBMS tables.
    Are you aware of a framework/product or tool to do what DBMS_XMLSTORE does but with any format of XML file ? If not, I am doomed.
    Cheers.
    Luc.
    Edited by: user6693852 on Jan 13, 2009 7:02 AM

    In case you decide to follow my advice, here's a simple example showing how to do this. (Note: the XMLTable syntax is the preferred approach in 10gR2 and later.)
    SQL> spool testase.log
    SQL> --
    SQL> connect / as sysdba
    Connected.
    SQL> --
    SQL> set define on
    SQL> set timing on
    SQL> --
    SQL> define USERNAME = XDBTEST
    SQL> --
    SQL> def PASSWORD = XDBTEST
    SQL> --
    SQL> def USER_TABLESPACE = USERS
    SQL> --
    SQL> def TEMP_TABLESPACE = TEMP
    SQL> --
    SQL> drop user &USERNAME cascade
      2  /
    old   1: drop user &USERNAME cascade
    new   1: drop user XDBTEST cascade
    User dropped.
    Elapsed: 00:00:00.59
    SQL> grant create any directory, drop any directory, connect, resource, alter session, create view to &USERNAME identified by &PASSWORD
      2  /
    old   1: grant create any directory, drop any directory, connect, resource, alter session, create view to &USERNAME identified by &PASSWORD
    new   1: grant create any directory, drop any directory, connect, resource, alter session, create view to XDBTEST identified by XDBTEST
    Grant succeeded.
    Elapsed: 00:00:00.01
    SQL> alter user &USERNAME default tablespace &USER_TABLESPACE temporary tablespace &TEMP_TABLESPACE
      2  /
    old   1: alter user &USERNAME default tablespace &USER_TABLESPACE temporary tablespace &TEMP_TABLESPACE
    new   1: alter user XDBTEST default tablespace USERS temporary tablespace TEMP
    User altered.
    Elapsed: 00:00:00.00
    SQL> connect &USERNAME/&PASSWORD
    Connected.
    SQL> --
    SQL> var SCHEMAURL varchar2(256)
    SQL> var XMLSCHEMA CLOB
    SQL> --
    SQL> set define off
    SQL> --
    SQL> begin
      2    :SCHEMAURL := 'http://xmlns.example.com/askTom/TransactionList.xsd';
      3    :XMLSCHEMA :=
      4  '<?xml version="1.0" encoding="UTF-8"?>
      5  <!--W3C Schema generated by XMLSpy v2008 rel. 2 sp2 (http://www.altova.com)-->
      6  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:xdb="http://xmlns.oracle.com/xdb" xdb:storeVarrayAsTable="true">
      7     <xs:element name="TransactionList" type="transactionListType" xdb:defaultTable="LOCAL_TABLE"/>
      8     <xs:complexType name="transactionListType"  xdb:maintainDOM="false" xdb:SQLType="TRANSACTION_LIST_T">
      9             <xs:sequence>
    10                     <xs:element name="Transaction" type="transactionType" maxOccurs="unbounded" xdb:SQLCollType="TRANSACTION_V"/>
    11             </xs:sequence>
    12     </xs:complexType>
    13     <xs:complexType name="transactionType" xdb:maintainDOM="false"  xdb:SQLType="TRANSACTION_T">
    14             <xs:sequence>
    15                     <xs:element name="TradeVersion" type="xs:integer"/>
    16                     <xs:element name="TransactionId" type="xs:integer"/>
    17                     <xs:element name="Leg" type="legType" maxOccurs="unbounded" xdb:SQLCollType="LEG_V"/>
    18             </xs:sequence>
    19             <xs:attribute name="id" type="xs:integer" use="required"/>
    20     </xs:complexType>
    21     <xs:complexType name="paymentType"  xdb:maintainDOM="false" xdb:SQLType="PAYMENT_T">
    22             <xs:sequence>
    23                     <xs:element name="StartDate" type="xs:date"/>
    24                     <xs:element name="Value" type="xs:integer"/>
    25             </xs:sequence>
    26             <xs:attribute name="id" type="xs:integer" use="required"/>
    27     </xs:complexType>
    28     <xs:complexType name="legType"  xdb:maintainDOM="false"  xdb:SQLType="LEG_T">
    29             <xs:sequence>
    30                     <xs:element name="LegNumber" type="xs:integer"/>
    31                     <xs:element name="Basis" type="xs:integer"/>
    32                     <xs:element name="FixedRate" type="xs:integer"/>
    33                     <xs:element name="Payment" type="paymentType" maxOccurs="unbounded"  xdb:SQLCollType="PAYMENT_V"/>
    34             </xs:sequence>
    35             <xs:attribute name="id" type="xs:integer" use="required"/>
    36     </xs:complexType>
    37  </xs:schema>';
    38  end;
    39  /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.00
    SQL> set define on
    SQL> --
    SQL> declare
      2    res boolean;
      3    xmlSchema xmlType := xmlType(:XMLSCHEMA);
      4  begin
      5    dbms_xmlschema.registerSchema
      6    (
      7      schemaurl => :schemaURL,
      8      schemadoc => xmlSchema,
      9      local     => TRUE,
    10      genTypes  => TRUE,
    11      genBean   => FALSE,
    12      genTables => TRUE,
    13      ENABLEHIERARCHY => DBMS_XMLSCHEMA.ENABLE_HIERARCHY_NONE
    14    );
    15  end;
    16  /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.26
    SQL> desc LOCAL_TABLE
    Name                                                                   Null?    Type
    TABLE of SYS.XMLTYPE(XMLSchema "http://xmlns.example.com/askTom/TransactionList.xsd" Element "TransactionList") STORAGE Object-relational TYPE "TRANSACTION_LIST_T"
    SQL> --
    SQL> create or replace VIEW TRAN_VIEW
      2  as
      3  select
      4    extractvalue(x.column_value,'/Transaction/TradeVersion/text()') tradeversion,
      5    extractvalue(x.column_value,'/Transaction//text()') transactionid
      6  from
      7    local_table,
      8    table(xmlsequence(extract(OBJECT_VALUE,'/TransactionList/Transaction'))) x
      9  /
    View created.
    Elapsed: 00:00:00.01
    SQL> create or replace VIEW TRAN_LEG_VIEW
      2  as
      3  select
      4    extractvalue(x.column_value,'/Transaction/TransactionId/text()') transactionid,
      5    extractvalue(y.column_value,'/Leg/Basis/text()') leg_basis,
      6    extractValue(y.column_value,'/Leg/FixedRate/text()') leg_fixedrate
      7  from
      8    local_table,
      9    table(xmlsequence(extract(OBJECT_VALUE,'/TransactionList/Transaction'))) x,
    10    table(xmlsequence(extract(x.column_value,'/Transaction/Leg'))) y
    11  /
    View created.
    Elapsed: 00:00:00.01
    SQL> create or replace VIEW TRAN_LEG_PAY_VIEW
      2  as
      3  select
      4    extractvalue(x.column_value,'/Transaction/TransactionId/text()') transactionid,
      5    extractvalue(y.column_value,'/Leg/LegNumber/text()') leg_legnumber,
      6    extractvalue(z.column_value,'/Payment/StartDate/text()') pay_startdate,
      7    extractValue(z.column_value,'/Payment/Value/text()') pay_value
      8  from
      9    local_table,
    10    table(xmlsequence(extract(OBJECT_VALUE,'/TransactionList/Transaction'))) x,
    11    table(xmlsequence(extract(x.column_value,'/Transaction/Leg'))) y,
    12    table(xmlsequence(extract(y.column_value,'/Leg/Payment'))) z
    13  /
    View created.
    Elapsed: 00:00:00.03
    SQL> desc TRAN_VIEW
    Name                                                                   Null?    Type
    TRADEVERSION                                                                    NUMBER(38)
    TRANSACTIONID                                                                   VARCHAR2(4000)
    SQL> --
    SQL> desc TRAN_LEG_VIEW
    Name                                                                   Null?    Type
    TRANSACTIONID                                                                   NUMBER(38)
    LEG_BASIS                                                                       NUMBER(38)
    LEG_FIXEDRATE                                                                   NUMBER(38)
    SQL> --
    SQL> desc TRAN_LEG_PAY_VIEW
    Name                                                                   Null?    Type
    TRANSACTIONID                                                                   NUMBER(38)
    LEG_LEGNUMBER                                                                   NUMBER(38)
    PAY_STARTDATE                                                                   DATE
    PAY_VALUE                                                                       NUMBER(38)
    SQL> --
    SQL> create or replace VIEW TRAN_VIEW_XMLTABLE
      2  as
      3  select t.*
      4    from LOCAL_TABLE,
      5         XMLTable
      6         (
      7            '/TransactionList/Transaction'
      8            passing OBJECT_VALUE
      9            columns
    10            TRADE_VERSION  NUMBER(4) path 'TradeVersion/text()',
    11            TRANSACTION_ID NUMBER(4) path 'TransactionId/text()'
    12         ) t
    13  /
    View created.
    Elapsed: 00:00:00.01
    SQL> create or replace VIEW TRAN_LEG_VIEW_XMLTABLE
      2  as
      3  select t.TRANSACTION_ID, L.*
      4    from LOCAL_TABLE,
      5         XMLTable
      6         (
      7            '/TransactionList/Transaction'
      8            passing OBJECT_VALUE
      9            columns
    10            TRANSACTION_ID NUMBER(4) path 'TransactionId/text()',
    11            LEG            XMLType   path 'Leg'
    12         ) t,
    13         XMLTABLE
    14         (
    15           '/Leg'
    16           passing LEG
    17           columns
    18           LEG_NUMBER     NUMBER(4) path 'LegNumber/text()',
    19           LEG_BASIS      NUMBER(4) path 'Basis/text()',
    20           LEG_FIXED_RATE NUMBER(4) path 'FixedRate/text()'
    21         ) l
    22  /
    View created.
    Elapsed: 00:00:00.01
    SQL> create or replace VIEW TRAN_LEG_PAY_VIEW_XMLTABLE
      2  as
      3  select TRANSACTION_ID, L.LEG_NUMBER, P.*
      4    from LOCAL_TABLE,
      5         XMLTable
      6         (
      7            '/TransactionList/Transaction'
      8            passing OBJECT_VALUE
      9            columns
    10            TRANSACTION_ID NUMBER(4) path 'TransactionId/text()',
    11            LEG            XMLType   path 'Leg'
    12         ) t,
    13         XMLTABLE
    14         (
    15           '/Leg'
    16           passing LEG
    17           columns
    18           LEG_NUMBER     NUMBER(4) path 'LegNumber/text()',
    19           PAYMENT        XMLType   path 'Payment'
    20         ) L,
    21         XMLTABLE
    22         (
    23           '/Payment'
    24           passing PAYMENT
    25           columns
    26           PAY_START_DATE     DATE      path 'StartDate/text()',
    27           PAY_VALUE          NUMBER(4) path 'Value/text()'
    28         ) p
    29  /
    View created.
    Elapsed: 00:00:00.03
    SQL> desc TRAN_VIEW_XMLTABLE
    Name                                                                   Null?    Type
    TRADE_VERSION                                                                   NUMBER(4)
    TRANSACTION_ID                                                                  NUMBER(4)
    SQL> --
    SQL> desc TRAN_LEG_VIEW_XMLTABLE
    Name                                                                   Null?    Type
    TRANSACTION_ID                                                                  NUMBER(4)
    LEG_NUMBER                                                                      NUMBER(4)
    LEG_BASIS                                                                       NUMBER(4)
    LEG_FIXED_RATE                                                                  NUMBER(4)
    SQL> --
    SQL> desc TRAN_LEG_PAY_VIEW_XMLTABLE
    Name                                                                   Null?    Type
    TRANSACTION_ID                                                                  NUMBER(4)
    LEG_NUMBER                                                                      NUMBER(4)
    PAY_START_DATE                                                                  DATE
    PAY_VALUE                                                                       NUMBER(4)
    SQL> --
    SQL> set long 10000 pages 100 lines 128
    SQL> set timing on
    SQL> set autotrace on explain
    SQL> set heading on feedback on
    SQL> --
    SQL> VAR DOC1 CLOB
    SQL> VAR DOC2 CLOB
    SQL> --
    SQL> begin
      2    :DOC1 :=
      3  '<TransactionList>
      4    <Transaction id="1">
      5      <TradeVersion>1</TradeVersion>
      6      <TransactionId>1</TransactionId>
      7      <Leg id="1">
      8        <LegNumber>1</LegNumber>
      9        <Basis>1</Basis>
    10        <FixedRate>1</FixedRate>
    11        <Payment id="1">
    12          <StartDate>2000-01-01</StartDate>
    13          <Value>1</Value>
    14        </Payment>
    15        <Payment id="2">
    16          <StartDate>2000-01-02</StartDate>
    17          <Value>2</Value>
    18        </Payment>
    19      </Leg>
    20      <Leg id="2">
    21        <LegNumber>2</LegNumber>
    22        <Basis>2</Basis>
    23        <FixedRate>2</FixedRate>
    24        <Payment id="1">
    25          <StartDate>2000-02-01</StartDate>
    26          <Value>10</Value>
    27        </Payment>
    28        <Payment id="2">
    29          <StartDate>2000-02-02</StartDate>
    30          <Value>20</Value>
    31        </Payment>
    32      </Leg>
    33    </Transaction>
    34    <Transaction id="2">
    35      <TradeVersion>2</TradeVersion>
    36      <TransactionId>2</TransactionId>
    37      <Leg id="1">
    38        <LegNumber>21</LegNumber>
    39        <Basis>21</Basis>
    40        <FixedRate>21</FixedRate>
    41        <Payment id="1">
    42          <StartDate>2002-01-01</StartDate>
    43          <Value>21</Value>
    44        </Payment>
    45        <Payment id="2">
    46          <StartDate>2002-01-02</StartDate>
    47          <Value>22</Value>
    48        </Payment>
    49      </Leg>
    50      <Leg id="22">
    51        <LegNumber>22</LegNumber>
    52        <Basis>22</Basis>
    53        <FixedRate>22</FixedRate>
    54        <Payment id="21">
    55          <StartDate>2002-02-01</StartDate>
    56          <Value>210</Value>
    57        </Payment>
    58        <Payment id="22">
    59          <StartDate>2002-02-02</StartDate>
    60          <Value>220</Value>
    61        </Payment>
    62      </Leg>
    63    </Transaction>
    64  </TransactionList>';
    65    :DOC2 :=
    66  '<TransactionList>
    67    <Transaction id="31">
    68      <TradeVersion>31</TradeVersion>
    69      <TransactionId>31</TransactionId>
    70      <Leg id="31">
    71        <LegNumber>31</LegNumber>
    72        <Basis>31</Basis>
    73        <FixedRate>31</FixedRate>
    74        <Payment id="31">
    75          <StartDate>3000-01-01</StartDate>
    76          <Value>31</Value>
    77        </Payment>
    78      </Leg>
    79    </Transaction>
    80  </TransactionList>';
    81  end;
    82  /
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.01
    SQL> insert into LOCAL_TABLE values ( xmltype(:DOC1))
      2  /
    1 row created.
    Elapsed: 00:00:00.01
    Execution Plan
    Plan hash value: 1
    | Id  | Operation                | Name        | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | INSERT STATEMENT         |             |     1 |   100 |     1   (0)| 00:00:01 |
    |   1 |  LOAD TABLE CONVENTIONAL | LOCAL_TABLE |       |       |            |          |
    SQL> insert into LOCAL_TABLE values ( xmltype(:DOC2))
      2  /
    1 row created.
    Elapsed: 00:00:00.01
    Execution Plan
    Plan hash value: 1
    | Id  | Operation                | Name        | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | INSERT STATEMENT         |             |     1 |   100 |     1   (0)| 00:00:01 |
    |   1 |  LOAD TABLE CONVENTIONAL | LOCAL_TABLE |       |       |            |          |
    SQL> select * from TRAN_VIEW_XMLTABLE
      2  /
    TRADE_VERSION TRANSACTION_ID
                1              1
                2              2
               31             31
    3 rows selected.
    Elapsed: 00:00:00.03
    Execution Plan
    Plan hash value: 650975545
    | Id  | Operation          | Name                           | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT   |                                |     3 |   168 |     3   (0)| 00:00:01 |
    |   1 |  NESTED LOOPS      |                                |     3 |   168 |     3   (0)| 00:00:01 |
    |*  2 |   TABLE ACCESS FULL| SYS_NTGgl+TKyhQnWoFRSrCxeX9g== |     3 |   138 |     3   (0)| 00:00:01 |
    |*  3 |   INDEX UNIQUE SCAN| SYS_C0010174                   |     1 |    10 |     0   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       2 - filter("SYS_NC_TYPEID$" IS NOT NULL)
       3 - access("NESTED_TABLE_ID"="LOCAL_TABLE"."SYS_NC0000800009$")
    Note
       - dynamic sampling used for this statement
    SQL> select * from TRAN_LEG_VIEW_XMLTABLE
      2  /
    TRANSACTION_ID LEG_NUMBER  LEG_BASIS LEG_FIXED_RATE
                 1          1          1              1
                 1          2          2              2
                 2         21         21             21
                 2         22         22             22
                31         31         31             31
    5 rows selected.
    Elapsed: 00:00:00.04
    Execution Plan
    Plan hash value: 1273661583
    | Id  | Operation           | Name                           | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT    |                                |     5 |   560 |     7  (15)| 00:00:01 |
    |*  1 |  HASH JOIN          |                                |     5 |   560 |     7  (15)| 00:00:01 |
    |   2 |   NESTED LOOPS      |                                |     3 |   159 |     3   (0)| 00:00:01 |
    |*  3 |    TABLE ACCESS FULL| SYS_NTGgl+TKyhQnWoFRSrCxeX9g== |     3 |   129 |     3   (0)| 00:00:01 |
    |*  4 |    INDEX UNIQUE SCAN| SYS_C0010174                   |     1 |    10 |     0   (0)| 00:00:01 |
    |*  5 |   TABLE ACCESS FULL | SYS_NTUmyermF/S721C/2UXo40Uw== |     5 |   295 |     3   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       1 - access("SYS_ALIAS_1"."NESTED_TABLE_ID"="SYS_ALIAS_0"."SYS_NC0000800009$")
       3 - filter("SYS_NC_TYPEID$" IS NOT NULL)
       4 - access("NESTED_TABLE_ID"="LOCAL_TABLE"."SYS_NC0000800009$")
       5 - filter("SYS_NC_TYPEID$" IS NOT NULL)
    Note
       - dynamic sampling used for this statement
    SQL> select * from TRAN_LEG_PAY_VIEW_XMLTABLE
      2  /
    TRANSACTION_ID LEG_NUMBER PAY_START  PAY_VALUE
                 1          1 01-JAN-00          1
                 1          1 02-JAN-00          2
                 1          2 01-FEB-00         10
                 1          2 02-FEB-00         20
                 2         21 01-JAN-02         21
                 2         21 02-JAN-02         22
                 2         22 01-FEB-02        210
                 2         22 02-FEB-02        220
                31         31 01-JAN-00         31
    9 rows selected.
    Elapsed: 00:00:00.07
    Execution Plan
    Plan hash value: 4004907785
    | Id  | Operation            | Name                           | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT     |                                |     9 |  1242 |    10  (10)| 00:00:01 |
    |*  1 |  HASH JOIN           |                                |     9 |  1242 |    10  (10)| 00:00:01 |
    |*  2 |   HASH JOIN          |                                |     5 |   480 |     7  (15)| 00:00:01 |
    |   3 |    NESTED LOOPS      |                                |     3 |   159 |     3   (0)| 00:00:01 |
    |*  4 |     TABLE ACCESS FULL| SYS_NTGgl+TKyhQnWoFRSrCxeX9g== |     3 |   129 |     3   (0)| 00:00:01 |
    |*  5 |     INDEX UNIQUE SCAN| SYS_C0010174                   |     1 |    10 |     0   (0)| 00:00:01 |
    |*  6 |    TABLE ACCESS FULL | SYS_NTUmyermF/S721C/2UXo40Uw== |     5 |   215 |     3   (0)| 00:00:01 |
    |*  7 |   TABLE ACCESS FULL  | SYS_NTelW4ZRtKS+WKqCaXhsHnNQ== |     9 |   378 |     3   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       1 - access("NESTED_TABLE_ID"="SYS_ALIAS_1"."SYS_NC0000900010$")
       2 - access("SYS_ALIAS_1"."NESTED_TABLE_ID"="SYS_ALIAS_0"."SYS_NC0000800009$")
       4 - filter("SYS_NC_TYPEID$" IS NOT NULL)
       5 - access("NESTED_TABLE_ID"="LOCAL_TABLE"."SYS_NC0000800009$")
       6 - filter("SYS_NC_TYPEID$" IS NOT NULL)
       7 - filter("SYS_NC_TYPEID$" IS NOT NULL)
    Note
       - dynamic sampling used for this statement
    SQL>
    Out of interest, why are you so against using XMLType...
    Edited by: mdrake on Jan 13, 2009 8:25 AM

  • Insert XML file into Relational database model without using XMLTYPE tables

    Dear all,
    How can I store a known complex XML file into an existing relational database WITHOUT using xmltypes in the database ?
    I read the article on DBMS_XMLSTORE. DBMS_XMLSTORE indeed partially bridges the gap between XML and RDBMS to a certain extent, namely for simply structured XML (canonical structure) and simple tables.
    However, when the XML structure will become arbitrary and rapidly evolving, surely there must be a way to map XML to a relational model more flexibly.
    We work in a java/Oracle10 environment that receives very large XML documents from an independent data management source. These files comply with an XML schema. That is all we know. Still, all these data must be inserted/updated daily in an existing relational model. Quite an assignment isn't it ?
    The database does and will not contain XMLTYPES, only plain RDBMS tables.
    Are you aware of a framework/product or tool to do what DBMS_XMLSTORE does but with any format of XML file ? If not, I am doomed.
    Constraints : Input via XML files defined by third party
    Storage : relational database model with hundreds of tables and thousands of existing queries that can not be touched. The model must not be altered.
    Target : get this XML into the database on a daily basis via an automated process.
    Cheers.
    Luc.

    Luc,
    You're doomed!
    If you tried something like DBMS_XMLSTORE, you would probably run into serious performance problems in your case, very fast, and it would be very difficult to manage.
    If you used a little bit of the XMLType functionality, you would be able to shred the data into the relational model very fast and in a controllable way. Take it from me, I am one of those old geezers like Mr. Tom Kyte, way beyond 40 years (still joking). No, seriously: I started out as a classical PL/SQL and Forms guy who switched after two years to become a "DBA 1.0", and Mr. Codd and Mr. Date were for years my biggest heroes. I have the utmost respect for Mr. Tom Kyte for all his efforts in bringing the Concepts manual into the development world, just to name some of the names that influenced me. But you will have to work with UNSTRUCTURED data (as Mr. Date would call it); 80% of the data out there consists of it. Features like XMLTABLE and XML views bridge the gap between that unstructured world and the relational world. It is very doable to drag and drop an XML file into the XML DB database, into an XMLType table, or to deliver it via FTP, for instance. From that point on it is in the database, and from there you can move it into relational tables via XMLTABLE methods or XML views.
    You could see the described method as a filtering step that transforms the XML into relational data. If you don't want any XML in your current database, then create a small Oracle database with XML DB installed (if doable, 11.1.0.7 for the best performance, with all the new fast optimizer features etc.). Use that database as a staging area that does all the XML shredding into relational components for you, and ship the end result via database links, materialized views, or other familiar methods into your relational database that isn't allowed to contain XMLType.
    This way you would keep your relational Oracle database clean and have the Oracle XML DB staging database do all the filtering and shredding into relational components.
    Throwing the XML DB option out of the window beforehand would be like replacing your Mercedes with a bicycle. With both you will be able to travel the distance from Paris to Rome, but it will take you a hell of a lot longer.
    :-)

  • How to handle duplicate nodes in xslt mapping

    Hi,
    In my scenario I have a source mapped to a canonical structure, and then the canonical mapped to the target.
    source structure
    <empid>
    <national_id>
    canonical structure
    <id> 0 to unbounded
        <id>
        <type>
    The mapping from source to canonical is done by duplicating the canonical structure and then mapping:
    id - empid
    type - assigned a constant 'Employee'
    id - national_id
    type - assigned a constant 'National'
    I have used XSLT mapping with Stylus Studio, and the mapping from source to canonical is not the problem.
    When I map from the canonical to the target there is a problem:
    the node id is visible only once in the canonical when that structure is the source,
    but there is a duplicate node in the structure that is not visible. How do I proceed with the mapping from canonical to target?
    Kindly help me.
    with regards
    pradeep N

    hi,
    Udo Martens.
    source xml
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Materail_fileReceive xmlns:ns0="http://www.aprilbiztec.in/MM_MultiFile">
       <Emp_ID>12</Emp_ID>
       <National_ID>91</National_ID>
    </ns0:Materail_fileReceive>
    canonical xml
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Materail_file xmlns:ns0="http://www.aprilbiztec.in/MM_MultiFile">
       <Material>
          <ID>12</ID>
          <Type>Employee</Type>
       </Material>
       <Material>
          <ID>91</ID>
          <Type>National</Type>
       </Material>
    </ns0:Materail_file>
    the source to canonical mapping is
    <?xml version='1.0' ?>
    <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:ns0="http://www.aprilbiztec.in/MM_MultiFile">
         <xsl:template match="/">
              <ns0:Materail_file xmlns:ns0="http://www.aprilbiztec.in/MM_MultiFile">
                   <Material>
                        <ID>
                             <xsl:value-of select="ns0:Materail_fileReceive/Emp_ID"/>
                        </ID>
                        <Type>Employee</Type>
                   </Material>
                   <Material>
                        <ID>
                             <xsl:value-of select="ns0:Materail_fileReceive/National_ID"/>
                        </ID>
                        <Type>National</Type>
                   </Material>
              </ns0:Materail_file>
          </xsl:template>
     </xsl:stylesheet>
    I want the canonical-to-target mapping, where my target structure is similar to that of my source.
    Kindly help.
    regards
    pradeep N.
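    A minimal sketch of the canonical-to-target direction, assuming the target simply mirrors the source message shown above: the repeated <Material> nodes are selected by their <Type> value rather than by position.
    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch only: assumes the target reuses the Materail_fileReceive structure. -->
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                    xmlns:ns0="http://www.aprilbiztec.in/MM_MultiFile">
         <xsl:template match="/">
              <ns0:Materail_fileReceive>
                   <Emp_ID>
                        <!-- pick the Material node whose Type is Employee -->
                        <xsl:value-of select="ns0:Materail_file/Material[Type='Employee']/ID"/>
                   </Emp_ID>
                   <National_ID>
                        <!-- pick the Material node whose Type is National -->
                        <xsl:value-of select="ns0:Materail_file/Material[Type='National']/ID"/>
                   </National_ID>
              </ns0:Materail_fileReceive>
         </xsl:template>
    </xsl:stylesheet>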

  • Outbound ABAP Client Proxy

    Hi,
    Scenario - Client Proxy to JDBC.
    I am getting purchase order details out of a client proxy, which then updates multiple tables on the JDBC side. I am doing this via a single service interface that updates multiple tables (multiple Statement elements in the JDBC target structure).
    Client Proxy
    PO HEADER
    PO DETAIL
    PO TEXT
    PO ITEM
    JDBC tables
    tbl_HEADER
    tbl_DETAIL
    tbl_TEXT
    tbl_ITEM
    The DETAIL and ITEM tables from SAP have too many entries. When I tested in development, 14 PO headers had 1500 details, 500 texts and 800 items. So I am thinking of breaking these PO headers down in the ABAP client proxy and triggering the proxy for every 50 PO headers, so that it does not end up with huge PO details and texts.
    I am thinking of modifying the ABAP code to send 50 PO headers each time through the client proxy:
    loop on PO headers.
    if lv_counter is 50.
    call method to send via client proxy.
    then process the next 50 records, so that we are sending in chunks.
    PI would receive multiple messages, each consisting of 50 PO headers and many item details. Would the update/insert on the JDBC tables be affected? What precautions should I take in PI considering the update on JDBC?
    Kindly suggest any inputs.
    Thanks in advance.

    Hi,
    You can use the UPDATE_INSERT action of the receiver JDBC canonical structure.
    This ensures that rows that already exist are updated and only the delta is inserted into the tables.
    Regards
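    For illustration, a receiver payload in the JDBC adapter's canonical document format using that action; the statement name, root name and column names below are placeholders based on the tables mentioned above.
    <root>
      <Statement_Header>
        <tbl_HEADER action="UPDATE_INSERT">
          <table>tbl_HEADER</table>
          <access>
            <!-- columns to write; names are hypothetical -->
            <PO_NUMBER>4500000123</PO_NUMBER>
            <PO_DATE>2011-05-31</PO_DATE>
          </access>
          <key>
            <!-- rows matching the key are updated, all others are inserted -->
            <PO_NUMBER>4500000123</PO_NUMBER>
          </key>
        </tbl_HEADER>
      </Statement_Header>
    </root>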

  • Error in a formula function

    Hi,
    I ran the InfoPackage and the load failed. The failed request shows "Record 1: Error in a formula function (routine ROUTINE_0001), record 1". Why is this error coming and how can I solve this issue?
    Thanks,
    Ali

    Hi,
    As referred to by the posts above, you need to edit the data in the PSA. Now try going through the procedure below:
    1. Once the data is loaded up to the PSA, there will be a "Process Manually" tab in the status of the InfoPackage; press that.
    2. Now you will see the error in the processing packets.
    3. Set the InfoPackage to red and delete the failed request from the target (InfoCube); for a DSO there is a specific procedure.
    4. Now go back to the InfoPackage and click on the PSA tab. The error displays record '1', so select the first record in the PSA, put the size of the record in the number of error records, and select "error record".
    5. Now you will see this error record in the PSA.
    6. Click on "edit record"; you will see the canonical structure of the record with its fields.
    7. Now try editing the record, as there might be an entry that looks different from the others (compare with correctly loaded records).
    8. Edit it and save the changes.
    9. Now either reconstruct the PSA or manually push the failed request to the target.
    Hope this serves your cause.
    Thanks & Regards
    Vaibhave Sharma

  • Cursor on xsql

    Hi, I'm using the cursor element to get master detail xml pages.
    I'm doing something like:
    select a.a, a.b, cursor (
    select b.c, b.d from detail b where a.id_detail = b.id_detail) as c
    from master a
    The XML I get, in the detail sections is like:
    <c>
    <c_row>
    </c_row>
    </c>
    What I want is to change the name of the c_row element to another one of my choosing.
    Thanks in advance.

    You can change the C part of <C_ROW> but you cannot change the _ROW part. You'll need to apply an XSLT transformation to transform the canonical structure into another format with the appropriate tags. The XSQL Servlet can make this easy to do for you.                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                       
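    For example, the XSLT could be a plain identity transform that copies the XSQL output unchanged and only renames the generated row elements; the new tag name DETAIL below is just an example (adjust the element case to whatever your XSQL output actually emits).
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
                    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <!-- copy everything as-is by default -->
      <xsl:template match="@*|node()">
        <xsl:copy>
          <xsl:apply-templates select="@*|node()"/>
        </xsl:copy>
      </xsl:template>
      <!-- rename the row element of the nested cursor -->
      <xsl:template match="C_ROW">
        <DETAIL>
          <xsl:apply-templates select="@*|node()"/>
        </DETAIL>
      </xsl:template>
    </xsl:stylesheet>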

  • FCPX importing canon t3i, FCPX doesn't recognize the structure?

    I'm inserting the card from my new Canon T3i, but FCPX doesn't recognize the card. Any ideas?

    I have an idea.
    Create a folder on your desktop. Give it a name that ties it to the images and video you shot using the T3i, plus the date of the shoot. Connect your USB cable to your camera. Open Image Capture. Turn the T3i on. You should be able to transfer both images and video to the folder on your desktop via Image Capture. Check the folder to make sure your images and videos are there, then turn off your camera. (If you get a camera image on your desktop, you do not need to use Image Capture.) Open FCPX. Go to File > Import > Files; under Import Files look for your Desktop, then the folder with the stills and video; open the folder, highlight the video files, then import.
    I've been doing this with flash memory cards in dual-mode cameras, and it has worked for me without a problem.
    Remember, this is just an idea.
    Hope it works for you.

  • Canon EOS 5D Mark II with Adobe Bridge (CS4) and Camera Raw 5.4

    I use a Canon EOS 5D Mark II and store in RAW-format.
    The camera is updated with latest software etc.
    I have the Adobe Design CS4 package on my computer and updated to the camera raw 5.4 version (bridge/photoshop/etc).
    All this is updated to last versions / updates etc.
    I upload my pictures from the camera to my computer.
    When I inspect the files with the Canon software, such as "Digital Photo Professional 3.6.1.0" or "Canon Utilities ZoomBrowser EX 6.3.1" (these are the latest versions and updates), the files appear to have been copied correctly to my computer.
    The files can be opened with the canon software without any problem and can be modified and converted and all normal operations can be performed.
    The files can also be converted to ".jpg" and no problem occurs.
    The problem starts after the folder with the pictures (in RAW-format) has been opened / viewed in Adobe Bridge or after Adobe bridge is started.
    After starting Bridge and viewing the folder, only some of the files become corrupted and cannot be used anymore. So the majority of the files remain OK, but some files become corrupt.
    Part of the picture is still visible; the remaining part is lines, dots, squares etc., and the division between the correct and corrupted areas seems to be random. There is no structure in the way the corrupted files are shown in Bridge. Please look at the picture enclosed in this message.
    The files can be opened in Camera Raw and Photoshop, but they show the corruption, and they can still be stored etc.
    But the picture is destroyed.
    Does anybody have a solution or similar problems?
    Best regards

    There are forums for Bridge, Camera RAW and Photoshop.
    Any of them would be better than posting here.
    Bob

  • Is there a PS setting where it can read my Canon Mark 3 5D's monochrome setting so that the files will also appear in monochrome instead of colour? Or do I have to change from RAW each time to Monochrome within PS (even if I have set up for Monochrome in my 5D)?

    Is there a PS setting where it can read my Canon Mark 3 5D's monochrome setting so that the files will also appear in the same monochrome structure (instead of appearing in colour as it is now when i open in PS, not even monochrome)? Or do I have to change from RAW each time to Monochrome within PS (even if I have set up for Monochrome in my 5D)?
    1. Basically I am taking photos with a monochrome setting on my 5D, and the screen shows monochrome.
    2. When I import this onto my computer and open it in PS, it opens in colour.
    3. I then change this to monochrome in PS. (It cannot read my monochrome setting with my preferred greyscale mix as I viewed it on my 5D.)
    4. My question is: is there a way for PS to read the monochrome setting from my 5D and display the same monochrome rendering as my 5D, instead of a different one from PS?
    Lily

    You need Camera Raw 6.7 or better.
    I don't know what version of PS you have, so I can't say if it can be updated to use that, but update anyway.
    Pre-CC updates:  http://www.adobe.com/downloads/updates/
    CC updates:  http://prodesigntools.com/adobe-cc-updates-direct-links-windows.html
    CC 2014 updates:  http://prodesigntools.com/adobe-cc-2014-updates-links-windows.html
    If you can't update your Camera Raw sufficiently, use the DNG Converter. DNG Converters:
    Win
    Mac
