Exception: "Failed to save MapiAclTable table" using Exfolder.exe

To Forum,
I currently have a user who is trying to share their calendar with a fellow
employee and is getting "The modified permissions cannot be saved." in
Outlook 2010. I was able to replicate the issue on another Outlook 2010
client. I did some research and discovered the ExFolders.exe tool for Exchange
2010, connected to the database where the user resides, and opened the
calendar folder permissions; Default and Anonymous were listed there, but they
are not listed in the user's Outlook calendar permissions. I then proceeded to
clear the ACLs and add another Exchange user (hoping Default and Anonymous
would repopulate in the user's calendar permissions) and received "Failed to
save MapiAclTable table."
The only information on Google relating to this problem appears to concern
Exchange public folders, not a user's calendar ACL or DACL permissions.
Nick

Hi,
Glad to know that you found the solution, and thanks for sharing it.
Thanks,
Mavis Huang
TechNet Community Support

Similar Messages

  • Cannot join tables used in the workbook. Join "" not found in the EUL. Attempt to open workbook failed

    Trying to migrate workbooks from one instance to another using Discoverer 10.1.2.48.18, I am getting the following error:
    Unable to open the worksheet requested.
    - OracleBI Discoverer was unable to find the worksheet that was requested.
    - BIB-10310 An unknown exception occurred.
    - Cannot join tables used in the workbook. Join "" not found in the EUL. Attempt to open workbook failed.
    It is fine in one instance, but in prod I am getting the above error.
    Does anybody know how to rectify this?

    Note:267476.1 - for those who can't get into metalink but still need to know the answer
    Subject: Discoverer Workbook Fails With 'Cannot join tables. Item dependency "" not found in EUL' After EUL Migration, Importing Or Cloning
    Doc ID:      Note:267476.1      Type:      PROBLEM
    Last Revision Date:      28-APR-2005      Status:      PUBLISHED
    The information in this document applies to:
    Oracle Discoverer - Version: 4.1 to 10.1
    This problem can occur on any platform.
    Symptoms
    Opening a workbook migrated from one instance to another results in the following errors:
    * Unable to load EUL item ...
    * Cannot join tables. Item dependency "" not found in EUL.
    * Illegal operation... DIS4USR OR DIS5USR caused an invalid page fault in module....
    If using the workbook dump utility, you may notice:
    * Found in EUL by id *** Name has changed to .. where the new name is really the wrong column
    It seems that items are matched by internal item identifiers (IDs) and not by developer keys and names.
    Changes
    Migrating from one environment to another. For example, you may have performed a database export / import of the EUL schema from DEV to TEST
    Cause
    The timestamp is the same for both EULs, which can be confirmed by the following SQL statement:
    * Discoverer 4:
    SQL> select ver_eul_timestamp from eul4_versions;
    * Discoverer 9.0.2, 9.0.4 and 10.1:
    SQL> select ver_eul_timestamp from eul5_versions;
    If the result of the SQL output is the same in both EULs, the timestamps of both EULs are the same. Discoverer then assumes that the EULs are exactly the same (this can happen, for example, when one of them is created from the other EUL schema using the database export/import utilities), and this is the reason it tries to match items by identifiers rather than by developer keys or names.
    Solution
    Change the column ver_eul_timestamp in the eul4_versions (4.1) or eul5_versions (9.0.2 - 10.1) tables to be something different in the destination EUL. The format of this column is like year:month:day:seconds:milliseconds.
    Example:
    * Discoverer 4.1:
    SQL> update EUL4_VERSIONS set VER_EUL_TIMESTAMP = TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS');
    SQL> commit;
    * Discoverer 9.0.2:
    SQL> update EUL5_VERSIONS set VER_EUL_TIMESTAMP = TO_CHAR(SYSDATE, 'YYYYMMDDHH24MISS');
    SQL> commit;
    * Discoverer 9.0.4 -10.1:
    Run the script: EUL5_id.sql as the eulowner ID
    o For 9.0.4 -- This is found on a PC with the Discoverer Administrator installed in drive:\Oracle Home\discoverer\sql
    o For 10.1.0 -- This is found on a PC with the Discoverer Administrator installed in drive:\Oracle Home\discoverer\util
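    To verify the change took effect, re-run the version query from the Cause section above in both EULs (shown here for a 5.x EUL) and confirm the two values now differ:
    SQL> select ver_eul_timestamp from eul5_versions;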
    This is documented in:
    Oracle® Business Intelligence Discoverer
    Administration Guide
    10g Release 2 (10.1.2.0.0)
    Part No. B13916-02
    http://download.oracle.com/docs/pdf/B13916_02.pdf
    Chapter 4 "About copying EULs and EUL objects by exporting and importing"
    Oracle® Discoverer Administrator
    Administration Guide
    10g (9.0.4)
    Part No. B10270-01
    http://download.oracle.com/docs/pdf/B10270_01.pdf
    Chapter 3 "About copying EULs and EUL objects by exporting and importing"
    References
    Bug 3202329 - Migrating Workbooks Between Instances Shows Incorrect Column

  • Photoshop CS6 keeps crashing. I am using a brand new iMac, I have 16GB RAM, recently upgraded as it was running slow with 8GB, and lots of hard drive space. When it crashes, I lose all my presets and it changes my preferences; it also fails to save or produce

    Photoshop CS6 keeps crashing. I am using a brand new iMac, I have 16GB RAM, recently upgraded as it was running slow with 8GB, and lots of hard drive space. When it crashes, I lose all my presets and it changes my preferences; it also fails to save or produce a recovered file, despite settings asking it to do so every 5 minutes now. When I try to report the problem, it says it has found an issue and gives me a link to a solution, but won't let me click on the link. PLEASE HELP! It's killing my productivity.

    If the crash started right after you added RAM, it could be a bad RAM DIMM. You can try this: change the amount of RAM available to Photoshop to 4GB, which should stop Photoshop from using the higher RAM. If it doesn't crash, it's probably the RAM. Or run a diagnostic app. You don't say when it crashes or what you're doing when it crashes; is there a pattern? Did it ever work on this computer? Did it work on your previous computer? Without a lot of data, it could be related to the new computer. What OS version? Did you recently update it? Have you updated Photoshop to the most recent update?
    Do you have any plug-ins loaded? Try restarting Photoshop and immediately press & hold the Shift key to turn off plug-ins.

  • How do you save data to more than one table using NetBeans IDE with JSF

    Hi,
    I am new to JSF.
    I save / delete / update / create new records in a master table using POJOs (Plain Old Java Objects), an Oracle database and a TopLink persistence unit.
    How do you save data to more than one table using NetBeans IDE with JSF (I am using POJOs)?
    Also, please recommend a reference book for JSF.
    Thanks in advance.
    Regards,
    N.P.Siva

    SivaNellai wrote:
    > I am new to JSF. So, I am using NetBeans IDE 6.1 from Sun Microsystems. It is free software.
    No, you don't drag'n'drop if you're new to JSF. Switch to source code mode and write the code manually, with the help of the IDE for speed.
    > So, please guide me to reference books and articles. I need a basic understanding of JSF and NetBeans IDE.
    [JSF: The Complete Reference|http://www.amazon.com/JavaServer-Faces-Complete-Reference/dp/0072262400] is a good book. The [JSF specification document|http://jcp.org/aboutJava/communityprocess/final/jsr252/index.html] is also good reading to understand what JSF is and how it works. There are also the javadocs and tlddocs of Sun JSF Mojarra.

  • How to handle the failed records from the table when using DB Adapter

    Hi,
    I am reading some records from a table using the DB Adapter inside my synchronous BPEL process. Say I am reading 100 records from the table, and after successfully reading 90 records an error occurs on the 91st record for various reasons (DB down, connection interrupted, etc.). How do I handle this situation: do I have to read all the records from the beginning, or is there an option to continue from where it stopped reading?
    Can anybody please help me out in this regard?
    Thanks in advance
    Regards,
    Aejaz

    We had the same requirement some time ago and had two options:
    1. Ask the R/3 development team to add a deletion indicator in the table (and thus not actually delete the record). This deletion indicator could then be used like for any other standard datasource.
    This option was however refused, due to the huge data volume after a while.
    2. At the end of the load we copied ZTABLE1 to ZTABLE2. Then at the beginning of the load (the day after) we compare the data of ZTABLE1 to ZTABLE2. Entries available in ZTABLE2 but not in ZTABLE1 have been deleted, so we put a 'D' in the deletion indicator; as we only keep the deleted entries for one day, the volume of the new table is acceptable.
    M.
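    For illustration, a minimal SQL sketch of option 2 above, assuming ZTABLE2 mirrors ZTABLE1's structure including a DELETION_INDICATOR column and a key column ID (the column names are hypothetical):
    -- beginning of the next day's load: rows present in the snapshot
    -- but gone from ZTABLE1 were deleted, so flag them with 'D'
    UPDATE ZTABLE2 T2
       SET T2.DELETION_INDICATOR = 'D'
     WHERE NOT EXISTS (SELECT 1 FROM ZTABLE1 T1 WHERE T1.ID = T2.ID);
    -- end of the load: refresh the snapshot for tomorrow's comparison
    DELETE FROM ZTABLE2;
    INSERT INTO ZTABLE2 SELECT * FROM ZTABLE1;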

  • XML to Internal table using XSLT by CALL TRANSFORMATION error

    Dear experts,
    I have to fetch the data from an XML file into internal tables using XSLT. The XML file is very big, as follows:
    <?xml version="1.0" standalone="yes" ?>
    - <Shipment>
      <shipmentID>25091203S000778</shipmentID>
      <manifestDateTime>2009-12-03T20:16:52.00</manifestDateTime>
      <shipmentFacilityNumber>025</shipmentFacilityNumber>
      <shipmentFacilityAbbreviation>CHI</shipmentFacilityAbbreviation>
      <shipmentFacilityAddress1>810 KIMBERLY DRIVE</shipmentFacilityAddress1>
      <shipmentFacilityAddress2 />
      <shipmentFacilityCity>CAROL STREAM</shipmentFacilityCity>
      <shipmentFacilityState>IL</shipmentFacilityState>
      <shipmentFacilityPostalCode>601880000</shipmentFacilityPostalCode>
      <shipmentTruckCarrierCode>X150</shipmentTruckCarrierCode>
      <shipmentSourceCode>T</shipmentSourceCode>
      <userID>CAMPOSG</userID>
    - <Delivery>
      <primaryCustomerNumber>954371</primaryCustomerNumber>
      <primaryCustomerName>MIDWEST OFFICE SUPPLY</primaryCustomerName>
      <primaryCustomerAddress1 />
      <primaryCustomerAddress2>4765 INDUSTRIAL DR</primaryCustomerAddress2>
      <primaryCustomerCity>SPRINGFIELD</primaryCustomerCity>
      <primaryCustomerState>IL</primaryCustomerState>
      <primaryCustomerPostalCode>627030000</primaryCustomerPostalCode>
      <primaryCustomerPhoneNumber>2177535555</primaryCustomerPhoneNumber>
      <shuttleStopFacilityNumber />
      <billOfLadingNumber>25HZK99</billOfLadingNumber>
      <carrierProNumber />
      <shipmentTotalCartonCount>6</shipmentTotalCartonCount>
      <shipmentTotalWeight>266</shipmentTotalWeight>
    - <order>
      <orderNumber>25HZK99</orderNumber>
      <subOrderNumber />
      <dateProcessed>2009-12-03</dateProcessed>
      <primaryOrderNumber />
      <shipTruckCode>X150</shipTruckCode>
      <shipTruckDescription>UDS - ADDISON</shipTruckDescription>
      <shipTruckPriorityCode>01</shipTruckPriorityCode>
      <shipTruckGroupCode>01</shipTruckGroupCode>
      <shipTruckDepartureTime>20.00.00</shipTruckDepartureTime>
      <shipTruckDockID>07</shipTruckDockID>
      <ldpFacilityAbbreviation />
      <shuttleAvailableIndicator>N</shuttleAvailableIndicator>
      <shuttleMessageText />
      <crossDockFacilityCode />
      <crossDockTruckCode />
      <crossDockID />
      <subsidizedFreightTruckID />
      <customerPurchaseOrderNumber>623559</customerPurchaseOrderNumber>
      <headerTypeCode>P</headerTypeCode>
      <orderTypeID>RG</orderTypeID>
      <deliveryTypeID>DS</deliveryTypeID>
      <deliveryMethodCode />
      <customerBarCode />
      <customerReferenceData>25HZK99</customerReferenceData>
      <customerReferenceText />
      <customerRouteData>ZNED UNTED</customerRouteData>
      <customerRouteText>ROUTE</customerRouteText>
      <endConsumerPurchaseOrderNumber />
      <endConsumerPurchaseOrderText />
      <endConsumerName>CHARLESTON TRANS. FACILITY</endConsumerName>
      <endConsumerAddress1>HOMEWOOD DT PROGRAM DEPT. 3</endConsumerAddress1>
      <endConsumerAddress2>17341 PALMER BLVD.</endConsumerAddress2>
      <endConsumerAddress3 />
      <endConsumerCity>HOMEWOOD</endConsumerCity>
      <endConsumerState>IL</endConsumerState>
      <endConsumerPostalCode>60430</endConsumerPostalCode>
      <endConsumerCountryCode />
      <fillFacilityNumber>025</fillFacilityNumber>
      <shpFacilityNumber>025</shpFacilityNumber>
      <homeFacilityAbbrCode>STL</homeFacilityAbbrCode>
      <homeFacilityNumber>015</homeFacilityNumber>
      <multiCartonIndicator>Y</multiCartonIndicator>
      <primaryCustomerIndicator>Y</primaryCustomerIndicator>
      <shipToCustomerNumber>954371001</shipToCustomerNumber>
      <customerCompanyID>01</customerCompanyID>
      <customerTruckID>U888</customerTruckID>
      <customerTruckDescription>UDS - ADDISON</customerTruckDescription>
      <customerTruckDockID>13</customerTruckDockID>
      <thirdPartyBillCarrier />
      <thirdPartyBillID />
      <thirdPartyBillType />
      <qualityCheckIndicator>N</qualityCheckIndicator>
      <warehouseLaydownID />
      <packListPosition>I</packListPosition>
      <preferredPackingType>CTN</preferredPackingType>
      <preferredPackingMaterial>PAPER</preferredPackingMaterial>
      <preferedPackingInstructions />
      <totalOrderCartonQty>6</totalOrderCartonQty>
      <convertAddressIndicator>N</convertAddressIndicator>
      <dealerInstructionIndicator>Y</dealerInstructionIndicator>
      <dealerinstructions1>CPO#: 623559</dealerinstructions1>
      <dealerinstructions2>ATTN: DANA GRIFFIN</dealerinstructions2>
      <dealerinstructions3>INFO: 612</dealerinstructions3>
      <dealerinstructions4>ROUTE: ZNED UNTED</dealerinstructions4>
      <dealerinstructions5 />
      <dealerinstructions6 />
      <shippingInstructionsIndicator>N</shippingInstructionsIndicator>
      <shippingInstructions1 />
      <shippingInstructions2 />
      <shippingInstructions3 />
      <shippingInstructions4 />
      <shippingInstructions5 />
      <shippingInstructions6 />
      <specialInstructionsIndicator>N</specialInstructionsIndicator>
      <specialInstructions1 />
      <specialInstructions2 />
      <customeContainerDesc />
    - <carton>
      <deliveryCartonID>253370905995</deliveryCartonID>
      <shipIndicator>Y</shipIndicator>
      <deliveryPalletID>X150</deliveryPalletID>
      <consolidatedDeliveryCartonID />
      <scanDateTime>2009-12-03T19:36:12.00</scanDateTime>
      <cartonWeight>52</cartonWeight>
      <dropShipFlag>1</dropShipFlag>
      <carrierTrackingNumber />
      <carrierZoneID>0</carrierZoneID>
      <codAmount />
      <customerPackageAmount />
      <declaredValue />
      <residentialDeliveryIndicator />
      <serviceTypeCode>00</serviceTypeCode>
      <ssccCode>006860244400829393</ssccCode>
    - <Item>
      <shipPrefix>UNV</shipPrefix>
      <shipStockNumber>21200</shipStockNumber>
      <itemDescription>PAPER XERO/DUP WE LTR 20#</itemDescription>
      <orderQuantity>1</orderQuantity>
      <originalShipQuantity>1</originalShipQuantity>
      <shipQuantity>1</shipQuantity>
      <inventoryUnitCode>CT</inventoryUnitCode>
      <inventoryWeightQuantity>52.000</inventoryWeightQuantity>
      <upcNumber>00000000000000</upcNumber>
      <upcRetailCode>087547212004</upcRetailCode>
      <hazmatIndicator>N</hazmatIndicator>
      <serialRequiredIndicator>N</serialRequiredIndicator>
      <dealerMemoPO>S</dealerMemoPO>
      <cartonLineNumber>1</cartonLineNumber>
      <orderLineNumber>11</orderLineNumber>
      <originalOrderPrefix>UNV</originalOrderPrefix>
      <originalOrderStockNumber>21200</originalOrderStockNumber>
      <reasonCode />
    - <Item_Serial>
      <serialNumber />
      </Item_Serial>
        </Item>
      </carton>
       </order>
      </Delivery>
      </Shipment>
    This is not the complete XML file, as it exceeds 15000 characters and I cannot post it all here, so I have deleted much of it.
    The hierarchy is as follows: Shipment -> Delivery -> Order -> Carton -> Item.
    I have created an XSLT for it which is working fine.
    But when I execute my report program it raises CX_SY_XSLT_FORMAT_ERROR, saying:
    Transformation error: Non-canonical structure of element name XML_OUTPUT.

    Dear experts,
    My report program is as follows:
    *& Report  Z_ASNTRNS
    REPORT  Z_ASNTRNS.
    *& Report  Z_ASNTRNS
    TYPE-POOLS: abap, ixml.
    TABLES: ZASN_SHIPMENT,ZASN_DELIVERY,ZASN_ORDER,ZASN_CARTON,ZASN_ITEM.
    *CONSTANTS gs_file TYPE string VALUE 'C:\Documents and Settings\C5134126\Desktop\Rajesh_kandakatla\SampleASNFile.xml'.
    * This is the structure for the data from the XML file
    TYPES: BEGIN OF ts_item,
          ZSHIPMENT LIKE ZASN_ITEM-ZSHIPMENT,
          VBELN LIKE ZASN_ITEM-VBELN,
          ORDER_NUMBER LIKE ZASN_ITEM-ORDER_NUMBER,
          CARTON_ID LIKE ZASN_ITEM-CARTON_ID,
           ITEM LIKE ZASN_ITEM-ITEM,
           CARTON_LINE_NUM LIKE ZASN_ITEM-CARTON_LINE_NUM,
           CARTON_LINE_NUMBER LIKE ZASN_ITEM-CARTON_LINE_NUM,
          AEDAT(8),
          AEZET(6),
           ITEM_DESCRIPTION LIKE ZASN_ITEM-ITEM_DESCRIPTION,
           ORD_QTY(16),
           ORIGINAL_SHIP(16),
           SHIP_QTY(16),
           UPC_NUMBER LIKE ZASN_ITEM-UPC_NUMBER,
           DEALER_MEMO_PO(5),
           ORDER_LINE_NUM LIKE ZASN_ITEM-ORDER_LINE_NUM,
          STATUS LIKE ZASN_ITEM-STATUS,
           END OF ts_item.
    TYPES: BEGIN OF ts_carton,
          ZSHIPMENT LIKE ZASN_CARTON-ZSHIPMENT,
          VBELN LIKE ZASN_CARTON-VBELN,
          ORDER_NUMBER LIKE ZASN_CARTON-ORDER_NUMBER,
           CARTON_ID LIKE ZASN_CARTON-CARTON_ID,
          AEDAT(8),
          AEZET(6),
           SHIP_INDICATOR LIKE ZASN_CARTON-SHIP_INDICATOR,
           TRACKING_NUMBER LIKE ZASN_CARTON-TRACKING_NUMBER,
           ZZCARTON_WGT(18),
           Item type ts_item,
           END OF ts_carton.
    TYPES: BEGIN OF ts_order,
          ZSHIPMENT LIKE ZASN_ORDER-ZSHIPMENT,
          VBELN LIKE ZASN_ORDER-VBELN,
           ORDER_NUMBER LIKE ZASN_ORDER-ORDER_NUMBER,
          AEDAT(8),
          AEZET(6),
           SUB_ORDER LIKE ZASN_ORDER-SUB_ORDER,
           ORDER_DATE(8),
           PRIMARY_ORDER LIKE ZASN_ORDER-PRIMARY_ORDER,
           CUSTOMER_PO LIKE ZASN_ORDER-CUSTOMER_PO,
           PRIMARY_ID LIKE ZASN_ORDER-PRIMARY_ID,
           SHIP_TO LIKE ZASN_ORDER-SHIP_TO,
          ANZPK(5),
           carton type ts_carton,
           END OF ts_order.
    TYPES: BEGIN OF ts_delivery,
          ZSHIPMENT LIKE ZASN_DELIVERY-ZSHIPMENT,
          VBELN LIKE ZASN_DELIVERY-VBELN,
          AEDAT(8) TYPE C,
          AEZET(6) TYPE C,
           PRIMARY_CUSTOMER LIKE ZASN_DELIVERY-PRIMARY_CUSTOMER,
           BILL_OF_LADING LIKE ZASN_DELIVERY-BILL_OF_LADING,
           CARTON_COUNT(5),
           TOTAL_WEIGHT(18),
           order type ts_order,
           END OF ts_delivery.
    TYPES: BEGIN OF ts_shipment,
           ZSHIPMENT LIKE ZASN_SHIPMENT-ZSHIPMENT,
           MANIFEST_DATE_TIME(25),
          AEDAT(8) TYPE C,
          AEZET(6) TYPE C,
          SDATE(8) TYPE C,
          STIME(6) TYPE C,
           SFACILITY_NUMBER LIKE ZASN_SHIPMENT-SFACILITY_NUMBER,
           ZZCARRIERCODE LIKE ZASN_SHIPMENT-ZZCARRIERCODE,
           Delivery type ts_delivery,
           END OF ts_shipment.
    TYPES: BEGIN OF ts_shipment1,
           ZSHIPMENT LIKE ZASN_SHIPMENT-ZSHIPMENT,
           MANIFEST_DATE_TIME(25),
           SFACILITY_NUMBER LIKE ZASN_SHIPMENT-SFACILITY_NUMBER,
           ZZCARRIERCODE LIKE ZASN_SHIPMENT-ZZCARRIERCODE,
           PRIMARY_CUSTOMER LIKE ZASN_DELIVERY-PRIMARY_CUSTOMER,
           BILL_OF_LADING LIKE ZASN_DELIVERY-BILL_OF_LADING,
           CARTON_COUNT(5),
           TOTAL_WEIGHT(18),
           ORDER_NUMBER LIKE ZASN_ORDER-ORDER_NUMBER,
           SUB_ORDER LIKE ZASN_ORDER-SUB_ORDER,
           ORDER_DATE(8),
           PRIMARY_ORDER LIKE ZASN_ORDER-PRIMARY_ORDER,
           CUSTOMER_PO LIKE ZASN_ORDER-CUSTOMER_PO,
           PRIMARY_ID LIKE ZASN_ORDER-PRIMARY_ID,
           SHIP_TO LIKE ZASN_ORDER-SHIP_TO,
           CARTON_ID LIKE ZASN_CARTON-CARTON_ID,
           SHIP_INDICATOR LIKE ZASN_CARTON-SHIP_INDICATOR,
           TRACKING_NUMBER LIKE ZASN_CARTON-TRACKING_NUMBER,
           ZZCARTON_WGT(18),
           ITEM LIKE ZASN_ITEM-ITEM,
           CARTON_LINE_NUM LIKE ZASN_ITEM-CARTON_LINE_NUM,
           CARTON_LINE_NUMBER LIKE ZASN_ITEM-CARTON_LINE_NUM,
           ITEM_DESCRIPTION LIKE ZASN_ITEM-ITEM_DESCRIPTION,
           ORD_QTY(16),
           ORIGINAL_SHIP(16),
           SHIP_QTY(16),
           UPC_NUMBER LIKE ZASN_ITEM-UPC_NUMBER,
           DEALER_MEMO_PO(5),
           ORDER_LINE_NUM LIKE ZASN_ITEM-ORDER_LINE_NUM,
           END OF ts_shipment1.
    TYPES: BEGIN OF t_xml_line,
                 data(256) TYPE x,
               END OF t_xml_line.
    * Type definition for Airplus
    * READ THE DOCUMENTATION "LASG_XML_INVOICE_BTM"!!!
    * Vars beginning with "a_" are ATTRIBUTES
    DATA: l_ixml            TYPE REF TO if_ixml,
          l_streamfactory   TYPE REF TO if_ixml_stream_factory,
          l_parser          TYPE REF TO if_ixml_parser,
          l_istream         TYPE REF TO if_ixml_istream,
          l_ostream         TYPE REF TO if_ixml_ostream,
          l_document        TYPE REF TO if_ixml_document,
          l_node            TYPE REF TO if_ixml_node,
          l_xml TYPE REF TO cl_xml_document,
          l_xmldata         TYPE string.
    DATA: l_xml_table       TYPE TABLE OF t_xml_line,
             l_xml_line        TYPE t_xml_line,
             l_xml_table_size  TYPE i.
    DATA: l_filename        TYPE string.
    DATA: xml_out TYPE string ,
          size type i.
    DATA: l_xml_x1   TYPE xstring.
    DATA: l_len      TYPE i,
              l_len2     TYPE i,
              l_tab      TYPE tsfixml,
              l_content  TYPE string,
              l_str1     TYPE string,
              c_conv     TYPE REF TO cl_abap_conv_in_ce.
    *      l_itab     TYPE TABLE OF string.
    DATA: BEGIN OF l_itab occurs 0,
          data(256) type c,
          end of l_itab.
    TYPES : BEGIN OF TY_TEXT,
              data(255) type C,
            END OF TY_TEXT.
    DATA: F_XML TYPE STRING.
    DATA : LT_TEXT_OUT type table of TY_TEXT with header line.
    * tables
    DATA: it_shipment    TYPE STANDARD TABLE OF ts_shipment,
          wa_shipment    TYPE  ts_shipment.
    *Errorvariables
    DATA: xslt_err   TYPE REF TO cx_xslt_exception,
          err_string TYPE string.
    PARAMETERS: pa_file TYPE localfile OBLIGATORY
    DEFAULT 'C:\Documents and Settings\C5134126\Desktop\Rajesh_kandakatla\SampleASNFile.xml'.
    START-OF-SELECTION.
    * Creating the main iXML factory
        l_ixml = cl_ixml=>create( ).
    * Creating a stream factory
        l_streamfactory = l_ixml->create_stream_factory( ).
        PERFORM get_xml_table CHANGING l_xml_table_size l_xml_table.
    * here we use the CALL TRANSFORMATION statement, which calls
    * the XSLT program "z_asnfile"
    TRY.
        CALL TRANSFORMATION ('Z_ASNFILE')
          SOURCE xml LT_TEXT_OUT[]
          RESULT xml_output = it_shipment.
    * catch any error, very helpful if the XSLT isn't correct
        CATCH cx_xslt_exception INTO xslt_err.
        err_string = xslt_err->get_text( ).
        WRITE: / 'Transformation error: ', err_string.
        EXIT.
      ENDTRY. " set a breakpoint here to watch the work area
    * filled from the internal table "it_shipment"
           break-point.
      LOOP AT it_shipment INTO wa_shipment.
      ENDLOOP.
    *&      Form  get_xml_table
      FORM get_xml_table CHANGING l_xml_table_size TYPE i
                                  l_xml_table      TYPE STANDARD TABLE.
        l_filename = pa_file.
    * upload a file from the client's workstation
        CALL METHOD cl_gui_frontend_services=>gui_upload
          EXPORTING
            filename   = l_filename
            filetype   = 'BIN'
          IMPORTING
            filelength = l_xml_table_size
          CHANGING
            data_tab   = l_xml_table
          EXCEPTIONS
            OTHERS     = 19.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                     WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ENDIF.
    * Convert binary to text.
    CALL FUNCTION 'SCMS_BINARY_TO_TEXT'
      EXPORTING
        INPUT_LENGTH          = 70000
    *   FIRST_LINE            = 0
    *   LAST_LINE             = 0
    *   APPEND_TO_TABLE       = ' '
    *   MIMETYPE              = ' '
        WRAP_LINES            = 'X'
    * IMPORTING
    *   OUTPUT_LENGTH         =
      TABLES
        BINARY_TAB            = l_xml_table
        TEXT_TAB              = LT_TEXT_OUT
      EXCEPTIONS
        FAILED                = 1
        OTHERS                = 2.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    ENDFORM.                    "get_xml_table

  • Help with uploading an excel file to a table using an application

    Hello,
    Can anyone please help me out with this issue? I have an APEX application where the end users upload an Excel file to a table. For this I followed the solution provided in this link:
    http://avdeo.com/2008/05/21/uploading-excel-sheet-using-oracle-application-express-apex/
    Using the above solution, I was able to upload the Excel data to a table "sample_tbl1" successfully, with fields id, acct_no, owner_name, process_dt. But I want to accommodate a particular condition while uploading the file data: to check whether the acct_no already exists in another table, say "sample_tbl2". If an acct_no already exists in sample_tbl2, then give out an error displaying the list of account numbers that already exist in the database. Below is the code which I am using to upload the file data to a table.
    DECLARE
    v_blob_data       BLOB; 
    v_blob_len        NUMBER; 
    v_position        NUMBER; 
    v_raw_chunk       RAW(10000); 
    v_char            CHAR(1); 
    c_chunk_len       number       := 1; 
    v_line            VARCHAR2 (32767)        := NULL; 
    v_data_array      wwv_flow_global.vc_arr2; 
    v_rows            number; 
    v_sr_no           number         := 1; 
    l_cnt             BINARY_INTEGER := 0;
    l_stepid          NUMBER := 10;
    BEGIN
    --Read data from wwv_flow_files
    select blob_content into v_blob_data 
    from wwv_flow_files 
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER) 
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER); 
    v_blob_len := dbms_lob.getlength(v_blob_data); 
    v_position := 1; 
    -- Evaluate and skip first line of data
    WHILE (v_position <= v_blob_len ) LOOP
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
    v_char :=  chr(hex_to_decimal(rawtohex(v_raw_chunk)));
    v_line := v_line || v_char;
    v_position := v_position + c_chunk_len;
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Clear out 
    v_line := NULL;
    EXIT;
    END IF;
    END LOOP;
    -- Read and convert binary to char
    WHILE ( v_position <= v_blob_len ) LOOP 
    v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position); 
    v_char :=  chr(hex_to_decimal(rawtohex(v_raw_chunk))); 
    v_line := v_line || v_char; 
    v_position := v_position + c_chunk_len; 
    -- When a whole line is retrieved
    IF v_char = CHR(10) THEN
    -- Convert comma to : to use wwv_flow_utilities
    v_line := REPLACE (v_line, ',', ':'); 
    -- Convert each column separated by : into array of data
    v_data_array := wwv_flow_utilities.string_to_table (v_line); 
    -- Insert data into target table
    EXECUTE IMMEDIATE 'insert into sample_tbl1(ID,ACCT_NO,OWNER_NAME,PROCESS_DT) 
    values (:1,:2,:3,:4)'
    USING 
    v_sr_no, 
    v_data_array(1), 
    v_data_array(2),
    to_date(v_data_array(3),'MM/DD/YYYY');
    -- Clear out 
    v_line := NULL; 
    v_sr_no := v_sr_no + 1; 
    l_cnt := l_cnt + SQL%ROWCOUNT;
    END IF; 
    END LOOP;
    delete from wwv_flow_files
    where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
    and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);
    l_stepid  := 20;
    IF l_cnt = 0 THEN
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold">Please select a file to upload.</span></p>' ;
    ELSE
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold;color:green">File uploaded and processed ' || l_cnt || ' record(s) successfully.</span></p>';
    END IF;
    l_stepid  := 30;
    EXCEPTION WHEN OTHERS THEN
    ROLLBACK;
    apex_application.g_print_success_message := apex_application.g_print_success_message || '<p><span style="font-size:14;font-weight:bold;color:red">Failed to upload the file. '||REGEXP_REPLACE(SQLERRM,'[('')(<)(>)(,)(;)(:)(")('')]{1,}', '') ||'</span></p>';
    END;
    Can anyone please help me: how do I accommodate this condition within my existing code?
    Thanks,
    Orton

    Orton,
    From your code it appears that the account no comes in the second column of the file => v_data_array(1).
    So you can put a conditional block around the EXECUTE IMMEDIATE code that inserts the records.
    For instance:
      SELECT count(1) INTO ln_account_no_exists FROM sample_tbl2 WHERE account_no = v_data_array(1);
      IF ( ln_account_no_exists > 0 ) THEN
         -- Account no already exists
        <Do what you want to do here>
      ELSE
        EXECUTE IMMEDIATE ...
      END IF;
    In order to handle the account no records which already exist, you can:
      - Raise an exception
      - Write the record to a table or insert it into a collection, and then use a report region on the page based on this table/collection to show the error records
      - Append the errored account nos to the success message variable programmatically (this variable is used by the PL/SQL process success/error message), e.g.:
        IF ( record exists ) THEN
          apex_application.g_print_success_message := apex_application.g_print_success_message || ',' || v_data_array(1); -- comma-separated list of errored account nos
        ELSE ...
        END IF;
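    Putting the pieces together, a minimal sketch of the loop body with the duplicate check folded in. It assumes a variable ln_account_no_exists NUMBER is declared in the DECLARE section and that sample_tbl2's account column is named acct_no (adjust both to your actual schema):
    -- after v_data_array has been filled for the current line
    SELECT COUNT(1) INTO ln_account_no_exists
      FROM sample_tbl2
     WHERE acct_no = v_data_array(1);
    IF ln_account_no_exists > 0 THEN
      -- remember the duplicate account number for the error message
      apex_application.g_print_success_message :=
        apex_application.g_print_success_message || ',' || v_data_array(1);
    ELSE
      EXECUTE IMMEDIATE 'insert into sample_tbl1 (id, acct_no, owner_name, process_dt)
                         values (:1, :2, :3, :4)'
        USING v_sr_no, v_data_array(1), v_data_array(2),
              to_date(v_data_array(3), 'MM/DD/YYYY');
      v_sr_no := v_sr_no + 1;
      l_cnt := l_cnt + SQL%ROWCOUNT;
    END IF;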
    Hope it helps.

  • Creating database and tables using datasources in a portal application

    Hi All,
    I have created a datasource using Visual Administrator.
    Now I want to create a database and then tables using this datasource.
    I have to do this in the portal application.
    How do I create a database and then tables in this database?
    Regards,
    Brahmachaitanya

    Hi,
    I have created a datasource using Visual Administrator. I am trying to connect to it in my portal application. I have written the following code...
    InitialContext iContext = new InitialContext();
    DataSource ds = (DataSource) iContext.lookup("jdbc/CustomDataSource");
    Connection connection = ds.getConnection();
    But I am getting the following connection exception:
    ResourceException in method ConnectionFactoryImpl.getConnection(): com.sap.engine.services.dbpool.exceptions.BaseResourceException: SQLException thrown by the physical connection: com.microsoft.sqlserver.jdbc.SQLServerException: Login failed for user 'sysadmin'.
    Can anybody tell me how to resolve this exception?
    regards
    Brahmachaitanya

  • How to Update crmd_customer_h Table Using CRMV_EVENT Through Function Module

    Hi,
    How can we update the customer_h table using CRMV_EVENT? I implemented the logic below in the function module.
    data:     lt_doc_flow          TYPE crmt_doc_flow_wrkt,
              lw_cust_h_com        TYPE crmt_customer_h_com,
              lw_input_field_names TYPE crmt_input_field_names,
              lt_input_field_names TYPE crmt_input_field_names_tab,
              lt_objects_to_save TYPE crmt_object_guid_tab,
              lw_guid TYPE CRMT_OBJECT_GUID.
    DATA : lv_process_type TYPE crmt_process_type.
    data: wa_doc_flow type CRMT_DOC_FLOW_WRK.
    data: wa_customer_h type crmd_customer_h.
    *  Function module for retrieving the process type.
      CALL FUNCTION 'CRM_ORDERADM_H_READ_OW'
        EXPORTING
          iv_orderadm_h_guid     = iv_header_guid
        IMPORTING
          ev_process_type        = lv_process_type
        EXCEPTIONS
          admin_header_not_found = 1
          OTHERS                 = 2.
    if lv_process_type eq 'ZG01'.
    CALL FUNCTION 'CRM_DOC_FLOW_READ_OB'
    EXPORTING
       IV_HEADER_GUID                 = iv_header_guid
    IMPORTING
       ET_DOC_FLOW_WRK                = lt_doc_flow.
    read table lt_doc_flow with key objtype_a = 'BUS2000116' INTO wa_doc_flow. "gc_object_type-service.
                if sy-subrc eq 0.                    "set flag for service order
                lw_cust_h_com-ref_guid       =  wa_doc_flow-objkey_a.
                lw_cust_h_com-ZZTRAIL_FLAG   = 'X'.
                 lw_cust_h_com-mode           = 'A'.
                lw_cust_h_com-ref_handle     = '0000000001'.
                lw_guid = wa_doc_flow-objkey_a.
                INSERT lw_guid INTO TABLE lt_objects_to_save.
         endif.
    *           lw_input_field_names-fieldname = 'REF_GUID'.
                lw_input_field_names-fieldname = 'ZZTRAIL_FLAG'.
                lw_input_field_names-changeable = ' '.
                INSERT lw_input_field_names INTO TABLE lt_input_field_names.
    * Maintain Customer H
             CALL FUNCTION 'CRM_CUSTOMER_H_MAINTAIN_OW'
                  EXPORTING
                    is_customer_h_com    = lw_cust_h_com
                  CHANGING
                    ct_input_field_names = lt_input_field_names
                  EXCEPTIONS
                    header_change_error  = 1
                    header_create_error  = 2
                   error_occurred       = 3.
    ENDIF.
    *endif.
    *Clearing local variables
      clear: lv_process_type,
             lw_cust_h_com,
             lw_input_field_names.
    *Free internal tables
      free: lt_doc_flow,
            lt_input_field_names.

    Hi Faisal,
    I think you are not clear on what I am saying; anyhow, I will explain my requirement again.
    As per my requirement:
    1) In the service order search report I need to add a field called "Has trail order with No Follow up" with values "Yes" & "Blank".
    I added this field using the structure CRMST_QUERY_SRVO_BTIL, and through configuration I am able to display the field in the WebUI as per attachment pic 1.
    2) When I search with the search criterion "Has trail order with No Follow up" = "Yes", the result list should show only the service orders whose follow-ups are trail orders (sales orders); if a trail order has any follow-up document of its own, its service order should not appear in the result list.
    For the above requirement I implemented a function module using CRMV_EVENT, configured for BUS2000115 and BEFORE_SAVE of the order.
    The FM is triggered when I save the service order; if I then create a follow-up for that service order and try to save the trail order, the FM is triggered again, and in it I am doing the validations.
    3) One AET trail flag field is added under the CUSTOMER_H table.
    4) In the FM I validate that if the trail order has a service order as its preceding document, I set the flag to "X" for that service order in customer_h.
    If I delete the trail order from the service order, then that flag must be unset in CUSTOMER_H.
    We do the above because records are shown in the result list based on the flag values; these flag checks are validated in the BADI which we implemented for the search logic.
    Please refer to the logic below for my requirement:
    - Proposal: have a custom "flag" field (at table level, crmd_customer_h) linked to the service order, which gets flagged whenever at least one trial order is created and saved from the service order.
    The flag value should be cleared when all the trial orders created and saved as follow-up transactions are deleted from the system.
    - Similarly, trial orders will use the same custom "flag" field, which gets activated when at least one follow-up is created and saved from the trial order.
    The flag value should be cleared when all the follow-up transactions from the trial order are deleted from the system.
    - When the search criterion "Has Trial order with no follow up" "is" "Yes" is applied, the logic derives all the service orders which satisfy the additional search criteria applied in the search, and for these service orders checks whether the custom flag field is set, to derive all service orders which have a trial order. The custom flag values are derived from the crmd_customer_h table in CRM.
    4) Further, for all the trial orders determined in step 3, check whether the trial order has a follow-up by checking whether the custom flag field is set. The custom flag values are derived from the crmd_customer_h table in CRM.
    5) If step 4 is not met, populate the preceding service orders in the result list.

  • Error : External Content Type Failed to SAVE

    Hello All,
    I'm trying to connect a SQL Server table to a SharePoint list using BCS services + SharePoint Designer.
    I have created the external content type using SPD, setting all permissions on the table, operation properties, parameters and filter parameters; everything is fine.
    Then when I try to SAVE the external content type, I'm getting the error: External Content Type Failed to SAVE.
    Detailed error:
    An unexpected internal error occurred in the Business Data Connectivity Shared Service: Access to the temp directory is denied. The identity 'NT AUTHORITY\NETWORK SERVICE' under which XmlSerializer is running does not have sufficient permissions to access the temp directory. CodeDom will use the user account the process is using to do the compilation, so if the user does not have access to the system temp directory, you will not be able to compile. Use the Path.GetTempPath() API to find out the temp directory location.
    Before this I created a sample external content type which worked correctly. Any suggestions regarding this?
    Thanks in advance,
    NS

    Hi,
    I understand that when you save the external content type, there is an error. I have seen similar errors; you can try the steps below:
    Open C:\Windows\Temp.
    Right click on the Temp folder and choose Properties > Security.
    Add Network Service with Full Control permission.
    Thanks,
    Entan Ming
    TechNet Community Support

  • Quering MySql DB tables using DB adapter in Oracle SOA

    Hi,
    I am trying to query MySQL DB tables using the DB adapter in Oracle SOA.
    Inside the WebLogic Server console I have created a data source, and its test is "success" for MySQL.
    Now inside Deployments > DbAdapter > Configuration > Outbound Connection Pools I am creating a new outbound connection; however, I only get one option by default, "javax.resource.cci.ConnectionFactory", which by default points to an Oracle DB.
    If someone has implemented this earlier, or some blog/document demonstrates this, please let me know.
    Thanks,
    Yatan

    Thanks Naresh,
    Actually, I tried changing the "platformClassName" to "oracle.toplink.platform.database.MySQL4Platform" in the existing connection factory.
    I am able to update the DbAdapter successfully; however, when I try to run the service created in JDeveloper which uses this MySQL DB table, it gives me the error below.
    I also have another doubt: how can we create a new connection factory in the WebLogic console?
    ERROR:
    The selected operation process could not be invoked.
    An exception occured while invoking the webservice operation. Please see logs for more details.
    oracle.sysman.emSDK.webservices.wsdlapi.SoapTestException: Exception occured when binding was invoked.
    Exception occured during invocation of JCA binding: "JCA Binding execute of Reference operation 'insert' failed due to: Could not create/access the TopLink Session.
    This session is used to connect to the datastore.
    Caused by Exception [EclipseLink-7042] (Eclipse Persistence Services - 2.1.3.v20110304-r9073): org.eclipse.persistence.exceptions.ValidationException
    *Exception Description: Database platform class [org.eclipse.persistence.platform.database.MySQL4Platform] not found.*
    Internal Exception: Exception [EclipseLink-3007] (Eclipse Persistence Services - 2.1.3.v20110304-r9073): org.eclipse.persistence.exceptions.ConversionException
    Exception Description: The object [org.eclipse.persistence.platform.database.MySQL4Platform], of class [class java.lang.String], could not be converted to [class java.lang.Class]. Ensure that the class [org.eclipse.persistence.platform.database.MySQL4Platform] is on the CLASSPATH. You may need to use alternate API passing in the appropriate class loader as required, or setting it on the default ConversionManager
    Internal Exception: oracle.classloader.util.AnnotatedClassNotFoundException:
    Missing class: org.eclipse.persistence.platform.database.MySQL4Platform
    Dependent class: org.eclipse.persistence.internal.helper.ConversionManager
    Loader: sun.misc.Launcher$AppClassLoader@13288040
    Code-Source: /C:/Oracle/MiddlewarePS4/modules/org.eclipse.persistence_1.1.0.0_2-1.jar
    Configuration: /C:/Oracle/MiddlewarePS4/modules/org.eclipse.persistence_1.1.0.0_2-1.jar
    This load was initiated at default.composite.MySqlTest.soa_2c170363-5ad7-4578-8920-f30fb224a8d2:1.0 using the Class.forName() method.
    You may need to configure the connection settings in the deployment descriptor (i.e. DbAdapter.rar#META-INF/weblogic-ra.xml) and restart the server. This exception is considered not retriable, likely due to a modelling mistake.
    The invoked JCA adapter raised a resource exception.
    Please examine the above error message carefully to determine a resolution.

  • Problem accessing custom Portal DB Table using default Datasource in VA

    Hello,
    I have a PAR application which tries to connect to a custom DB table defined inside the portal DB schema. The application uses a JNDI lookup to the JDBC Connection service and uses the default DataSource (SAPJ2EDB/SAPSR3DB) defined in Visual Admin. However, it gives the following exception while running a SQL query on the custom table:
    SQLException::The SQL statement "SELECT COUNT(*) "ROWCOUNT" FROM "<Z Table Name>"" contains the semantics error[s]:
    - 1:34 - the table or view >><Z Table Name><< does not exist.
    I can connect and execute SQL using the same default DSN on any of the standard portal tables, like ume_strings.
    Any ideas where it's failing for the custom table?
    I can create a new DSN in Visual Admin, but would like to use the default DSN provided by the J2EE engine.
    Thanks,
    Sandip Agarwal

    Hi,
    So can you execute that exact SQL statement, for example via the SQL query tool? Does that Z table actually exist in the DB? Are you just using the table name, or do you fully qualify it, like master.ztable (i.e. are you telling it which schema to use)? See the example below.
    Hope this helps,
    Simon
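    For instance, a hypothetical sketch (SAPSR3DB is the portal schema mentioned in the question; ZMYTABLE stands in for the actual Z table name), fully qualifying the table with its schema:
    SELECT COUNT(*) "ROWCOUNT" FROM "SAPSR3DB"."ZMYTABLE"
    If the unqualified name fails but the qualified one succeeds, the connection's default schema is not the one the Z table was created in.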

  • Displaying table using call function 'REUSE_ALV_GRID_DISPLAY'

    I have created a table which has product code, product description, and product level. I am trying to display it using REUSE_ALV_GRID_DISPLAY. When I check it, I get the following error message: "PVS2" is not an internal table - the "OCCURS n" specification is missing.
    Is it possible to copy PVS2 into another table, and then display that table using REUSE_ALV_GRID_DISPLAY?
    I have patched together code from SDN, a client program, and my own code, and I am starting to get confused. So please help me.
    Regards,
    Al Lal
    REPORT  YABHINAV16.
    * program to display products at chosen level *
    Tables: T179, T179t.
    types:  begin of hierarchy,
            prodh type t179-prodh,
            vtext type t179t-vtext,
            stufe type t179-stufe,
            end of hierarchy.
    types: begin of text,
    prodh type t179t-prodh,
    vtext type t179t-vtext,
    end of text.
    data: pvs type standard table of hierarchy initial size 0.
    data: pvs2 type hierarchy.
    data: it_text type standard table of text,
    wa_text type text.
    TYPE-POOLS:SLIS.
    *For ALV
    DATA: GT_FLD TYPE SLIS_T_FIELDCAT_ALV,
          GT_EV TYPE SLIS_T_EVENT,
          GT_HDR TYPE SLIS_T_LISTHEADER,
          GT_SORT TYPE SLIS_T_SORTINFO_ALV.
    DATA: WA_FLD TYPE SLIS_FIELDCAT_ALV,
          WA_EV TYPE SLIS_ALV_EVENT,
          WA_HDR TYPE SLIS_LISTHEADER,
          WA_SORT TYPE SLIS_SORTINFO_ALV,
          WA_LAYOUT TYPE SLIS_LAYOUT_ALV.
    DEFINE FLD.
      WA_FLD-FIELDNAME   = &1.
      WA_FLD-TABNAME     = &2.
      WA_FLD-OUTPUTLEN   = &3.
      WA_FLD-SELTEXT_L   = &4.
      WA_FLD-SELTEXT_M   = &5.
      WA_FLD-SELTEXT_S   = &6.
      WA_FLD-COL_POS     = &7.
      WA_FLD-FIX_COLUMN  = &8.
      WA_FLD-DO_SUM      = &9.
      APPEND WA_FLD TO GT_FLD.
      CLEAR WA_FLD.
    END-OF-DEFINITION.
    CONSTANTS: C_TOP TYPE SLIS_FORMNAME VALUE 'TOP_OF_PAGE',
               C_USER_COMMAND            TYPE  SLIS_FORMNAME  VALUE 'USER_COMMAND'.
    DATA: MTRL LIKE SY-REPID,
          TITLE LIKE SY-TITLE.
    select-options level for t179-stufe no intervals.
    start-of-selection.
    Select prodh stufe from T179 into corresponding fields of table pvs where stufe in level.
    select prodh vtext from t179t into corresponding fields of table it_text for all entries in pvs where prodh = pvs-prodh.
    end-of-selection.
    sort pvs by prodh.
    sort it_text by prodh.
    loop at pvs into pvs2.
      read table it_text into wa_text with key prodh = pvs2-prodh binary search.
      if sy-subrc eq 0.
        pvs2-vtext = wa_text-vtext.
    *   write: / pvs2-prodh, pvs2-vtext, pvs2-stufe.
        modify pvs from pvs2. " write the description back so the ALV shows it
      endif.
    endloop.
    perform BUILD_FIELDCAT.
    perform GRID_DISPLAY.
    form BUILD_FIELDCAT .
        FLD 'PRODH'   'PVS2'   '20'     'Product Hierarchy'        ' ' ' '  '1'  ''   '' .
        FLD 'VTEXT'   'PVS2'   '40'     'Description '        ' ' ' '  '3'  ''   '' .
        FLD 'STUFE'   'PVS2'    '5'    'Level'        ' ' ' '  '2'  ''   '' .
    endform.                    " BUILD_FIELDCAT
    form GRID_DISPLAY .
      call function 'REUSE_ALV_GRID_DISPLAY'
        EXPORTING
          I_CALLBACK_PROGRAM      = MTRL
          I_CALLBACK_USER_COMMAND = C_USER_COMMAND  " pass the constant, not the literal 'C_USER_COMMAND'
          I_CALLBACK_TOP_OF_PAGE  = C_TOP
          IS_LAYOUT               = WA_LAYOUT
          IT_FIELDCAT             = GT_FLD          " manual field catalog, so no I_STRUCTURE_NAME is needed
          IT_SORT                 = GT_SORT
          I_DEFAULT               = 'X'
          I_SAVE                  = 'U'
          IT_EVENTS               = GT_EV
        TABLES
          T_OUTTAB                = PVS             " the internal table, not the work area PVS2
        EXCEPTIONS
          PROGRAM_ERROR           = 1
          others                  = 2.
      if SY-SUBRC <> 0.
        message id SY-MSGID type SY-MSGTY number SY-MSGNO
                with SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      endif.
    endform.                    " GRID_DISPLAY

    TYPE-POOLS : SLIS.
    DATA : BEGIN OF WA_T001,
             BUKRS LIKE T001-BUKRS,
             BUTXT LIKE T001-BUTXT,
             ORT01 LIKE T001-ORT01,
           END OF WA_T001,
           IT_T001 LIKE TABLE OF WA_T001.
    DATA : IT_FCAT TYPE SLIS_T_FIELDCAT_ALV,
           WA_FCAT LIKE LINE OF IT_FCAT.
    DATA : V_NAME LIKE SY-REPID.

    SELECT BUKRS BUTXT ORT01 FROM T001 INTO TABLE IT_T001 UP TO 15 ROWS.
    V_NAME = SY-REPID.

    * Build the field catalog from the definition of WA_T001.
    CALL FUNCTION 'REUSE_ALV_FIELDCATALOG_MERGE'
      EXPORTING
        I_PROGRAM_NAME     = V_NAME
        I_INTERNAL_TABNAME = 'WA_T001'
        I_INCLNAME         = V_NAME
      CHANGING
        CT_FIELDCAT        = IT_FCAT.

    CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
      EXPORTING
        I_CALLBACK_PROGRAM = V_NAME
        IT_FIELDCAT        = IT_FCAT
      TABLES
        T_OUTTAB           = IT_T001.

    SY-REPID is the system variable that holds the name of the current main ABAP program.
    * ----- Sample Program -----
    ***INCLUDE YRVR058_DEST_WISE_SUMMARY_DF01 .
    *&---------------------------------------------------------------------*
    *&      Form  DISPLAY_DATA
    *&---------------------------------------------------------------------*
    *       text      *-- Rajesh Vasudeva
    *  -->  p1        text
    *  <--  p2        text
    *----------------------------------------------------------------------*
    FORM DISPLAY_DATA .
      IF ITAB[] IS NOT INITIAL.
        PERFORM F_APPEND_BLOCK.
      ELSE.
        MESSAGE 'Data not found for the selection criteria' TYPE 'S'.
        LEAVE LIST-PROCESSING.
      ENDIF.
    ENDFORM.                    " display_data
    *&---------------------------------------------------------------------*
    *&      Form  f_append_block
    *&---------------------------------------------------------------------*
    *       text
    *  -->  p1        text
    *  <--  p2        text
    *----------------------------------------------------------------------*
    FORM F_APPEND_BLOCK .
      DATA : L_WA_SORT   TYPE SLIS_SORTINFO_ALV,  "For sort
             L_WA_EVENTS TYPE SLIS_ALV_EVENT.     "For events
    * Event (Top of List)
      CLEAR L_WA_EVENTS.
      L_WA_EVENTS-NAME = SLIS_EV_TOP_OF_LIST.
      L_WA_EVENTS-FORM = C_TOPOFPAGE.
      APPEND L_WA_EVENTS TO I_EVENTS_PART.
    * Event (Top of Page)
      CLEAR L_WA_EVENTS.
      L_WA_EVENTS-NAME = SLIS_EV_TOP_OF_PAGE.
      L_WA_EVENTS-FORM = 'F_DISPLAY_HEADER_PARTA'(031).
      "f_display_header_part
      APPEND L_WA_EVENTS TO I_EVENTS_PART.
    * Event (End of List)
      CLEAR L_WA_EVENTS.
      L_WA_EVENTS-NAME = SLIS_EV_END_OF_LIST.
      L_WA_EVENTS-FORM = C_END_OF_LIST.
      APPEND L_WA_EVENTS TO I_EVENTS_PART.
    * Set layout: zebra stripes
      STRUCT_LAYOUT-ZEBRA          = 'X'.
      STRUCT_LAYOUT-NUMC_SUM       = 'X'.
      STRUCT_LAYOUT-TOTALS_TEXT    = 'TOTAL:'(032).
    * Set field catalog
      PERFORM F_FIELD_CATALOG_PART.
      ASSIGN ITAB[] TO <F_OUTTAB>.
      V_PART = 'A'.  "initiating list is A
      PERFORM F_DISPLAY_BLOCK USING STRUCT_LAYOUT
                                   I_FIELD_CAT_PART[]
                                   C_TAB
                                   I_EVENTS_PART[]
                                   I_SORT_PART[].
    ENDFORM.                    " f_append_block
    *&---------------------------------------------------------------------*
    *&      Form  f_field_catalog_part
    *&---------------------------------------------------------------------*
    *       text
    *  -->  p1        text
    *  <--  p2        text
    *----------------------------------------------------------------------*
    FORM F_FIELD_CATALOG_PART .
      REFRESH I_FIELD_CAT_PART.
      CLEAR I_FIELD_CAT_PART.
      PERFORM F_CREATE_CATALOG USING :
    * Month
          C_TAB 'MONTH'      'MONTH'           SPACE 'L'  7 I_FIELD_CAT_PART[],
    * OBD
    *     C_TAB 'VBELN'      'Delivery'        SPACE 'L' 12 I_FIELD_CAT_PART[],
    * Date
          C_TAB 'WADAT_IST'  'Date'            SPACE 'L' 10 I_FIELD_CAT_PART[],
    * Destination
          C_TAB 'CITY1'      'Destination'     SPACE 'L' 25 I_FIELD_CAT_PART[],
    * Qty by road
          C_TAB 'NTGEW_ROAD' 'Road Quantity'   SPACE 'R' 16 I_FIELD_CAT_PART[],
    * Qty by rail
          C_TAB 'NTGEW_RAIL' 'Rail Quantity'   SPACE 'R' 16 I_FIELD_CAT_PART[],
    * Total qty
          C_TAB 'TOT'        'Total Quantity'  SPACE 'R' 16 I_FIELD_CAT_PART[],
    * RR/Truck no.
          C_TAB 'EXTI2'      'Truck/RR No.'    SPACE 'L' 17 I_FIELD_CAT_PART[].
    ENDFORM.                    " f_field_catalog_part
    *&---------------------------------------------------------------------*
    *&      Form  f_DISPLAY_block
    *&---------------------------------------------------------------------*
    *       text
    *      -->P_STRUCT_LAYOUT  text
    *      -->P_I_FIELD_CAT_PART[]  text
    *      -->P_C_TAB  text
    *      -->P_I_EVENTS_PART[]  text
    *      -->P_I_SORT_PART[]  text
    *----------------------------------------------------------------------*
    FORM F_DISPLAY_BLOCK USING FP_LAYOUT         TYPE SLIS_LAYOUT_ALV
                               FP_I_FCAT         TYPE SLIS_T_FIELDCAT_ALV
                               VALUE(FP_TABNAME) TYPE ANY
                               FP_I_EVENTS       TYPE SLIS_T_EVENT
                               FP_I_SORT         TYPE SLIS_T_SORTINFO_ALV.
      DATA: V_REPID TYPE SYREPID,               "current program id
            C_SAVE  TYPE CHAR1 VALUE 'A'.       "variant save
      V_REPID = SY-REPID.
      CALL FUNCTION 'REUSE_ALV_LIST_DISPLAY'
    *CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
        EXPORTING
          I_CALLBACK_PROGRAM = V_REPID
          IS_LAYOUT          = FP_LAYOUT
          IT_FIELDCAT        = FP_I_FCAT[]
          IT_SORT            = FP_I_SORT[]
          I_SAVE             = C_SAVE            "variant save
          IT_EVENTS          = FP_I_EVENTS[]
        TABLES
          T_OUTTAB           = <F_OUTTAB>
        EXCEPTIONS
          PROGRAM_ERROR      = 1
          OTHERS             = 2.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
    ENDFORM.                    " f_DISPLAY_block
    *&---------------------------------------------------------------------*
    *&      Form  f_create_catalog
    *&---------------------------------------------------------------------*
    *       text
    *      -->P_C_TAB  text
    *      -->P_0085   text
    *      -->P_0086   text
    *      -->P_SPACE  text
    *      -->P_0088   text
    *      -->P_5      text
    *      -->P_I_FIELD_CAT_PART[]  text
    *----------------------------------------------------------------------*
    FORM F_CREATE_CATALOG USING FP_I_TABNAME   TYPE SLIS_TABNAME
                                FP_I_FIELDNAME TYPE SLIS_FIELDNAME
                                FP_I_SELTEXT   TYPE SCRTEXT_L
                                FP_I_DOSUM     TYPE CHAR1
                                FP_I_JUST      TYPE C
                                FP_I_OUTPUTLEN TYPE OUTPUTLEN
                                FP_I_FCAT      TYPE SLIS_T_FIELDCAT_ALV.
    * Record for the field catalog
      DATA: L_REC_FCAT TYPE SLIS_FIELDCAT_ALV.
      L_REC_FCAT-TABNAME   = FP_I_TABNAME.
      L_REC_FCAT-FIELDNAME = FP_I_FIELDNAME.
      L_REC_FCAT-SELTEXT_L = FP_I_SELTEXT.
      L_REC_FCAT-DO_SUM    = FP_I_DOSUM.   " honour the caller's subtotal flag instead of hard-coding 'X'
      L_REC_FCAT-JUST      = FP_I_JUST.
      L_REC_FCAT-OUTPUTLEN = FP_I_OUTPUTLEN.
      L_REC_FCAT-DECIMALS_OUT = '2'.
      L_REC_FCAT-KEY          = '1'.
      APPEND L_REC_FCAT TO FP_I_FCAT.
    ENDFORM.                    " f_create_catalog
    *----------------------------------------------------------------------*
    *       Subroutines for Headings
    *----------------------------------------------------------------------*
    *&      Form  f_display_header_partA
    *       Display header for report for Part A
    *&---------------------------------------------------------------------*
    *&      Form  top_of_page
    *&---------------------------------------------------------------------*
    *  -->  p1        text
    *  <--  p2        text
    *----------------------------------------------------------------------*
    FORM TOP_OF_PAGE .
      SKIP 1.
      WRITE: /25 ' Name of Company ', 80 'RUN DATE', SY-DATUM.
      SKIP 1.
      WRITE: /60 'RUN DATE', SY-DATUM.
      SKIP 1.
      DATA: YR(4)       TYPE N,
            FIN_PRD(10) TYPE C.
    * Derive the financial period from the start date (YYYYMMDD),
    * e.g. 20120215 -> '2011-12', 20120912 -> '2012-13'.
      IF S_DTABF-LOW+4(2) LT '04'.
        YR = S_DTABF-LOW+0(4) - 1.
        CONCATENATE YR '-' S_DTABF-LOW+2(2) INTO FIN_PRD.
      ELSE.
        YR = S_DTABF-LOW+0(4) + 1.
        CONCATENATE S_DTABF-LOW+0(4) '-' YR+2(2) INTO FIN_PRD.
      ENDIF.
      WRITE: /5 'DETAILS OF THE MONTH/DATE WISE DESPATCHES MADE BY ROAD/RAIL DURING THE YEAR ', FIN_PRD.
      SKIP 1.
      WRITE: / 'SALES OFFICE : ', P_SALES, ' ', RNAME.
      SKIP 1.
    ENDFORM.                    " TOP_OF_PAGE
    Award Points If Useful...

  • Unable to export Hive managed table using Oraloader

    Hi,
    I am using MySQL as the Hive metastore and trying to export a Hive-managed table using Oraloader.
    I get the following exception in the JobTracker:
    2012-09-12 12:23:56,337 INFO org.apache.hadoop.mapred.JobTracker: Job job_201209121205_0007 added successfully for user 'oracle' to queue 'default'
    2012-09-12 12:23:56,338 INFO org.apache.hadoop.mapred.AuditLogger: USER=oracle IP=192.168.1.5 OPERATION=SUBMIT_JOB TARGET=job_201209121205_0007 RESULT=SUCCESS
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobTracker: Initializing job_201209121205_0007
    2012-09-12 12:23:56,353 INFO org.apache.hadoop.mapred.JobInProgress: Initializing job_201209121205_0007
    2012-09-12 12:23:56,594 INFO org.apache.hadoop.mapred.JobInProgress: jobToken generated and stored with users keys in /opt/ladap/common/hadoop-0.20.2-cdh3u1/hadoop-datastore/mapred/system/job_201209121205_0007/jobToken
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: Input size for job job_201209121205_0007 = 5812. Number of splits = 2
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000000 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,606 INFO org.apache.hadoop.mapred.JobInProgress: tip:task_201209121205_0007_m_000001 has split on node:/default-rack/hadoop-namenode
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress: job_201209121205_0007 LOCALITY_WAIT_FACTOR=1.0
    2012-09-12 12:23:56,607 ERROR org.apache.hadoop.mapred.JobTracker: Job initialization failed:
    java.lang.NegativeArraySizeException
    at org.apache.hadoop.mapred.JobInProgress.initTasks(JobInProgress.java:748)
    at org.apache.hadoop.mapred.JobTracker.initJob(JobTracker.java:4016)
    at org.apache.hadoop.mapred.EagerTaskInitializationListener$InitJob.run(EagerTaskInitializationListener.java:79)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
    at java.lang.Thread.run(Thread.java:679)
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobTracker: Failing job job_201209121205_0007
    2012-09-12 12:23:56,607 INFO org.apache.hadoop.mapred.JobInProgress$JobSummary: jobId=job_201209121205_0007,submitTime=1347467036196,launchTime=1347467036607,,finishTime=1347467036607,numMaps=2,numSlotsPerMap=1,numReduces=0,numSlotsPerReduce=1,user=oracle,queue=default,status=FAILED,mapSlotSeconds=0,reduceSlotsSeconds=0,clusterMapCapacity=10,clusterReduceCapacity=2
    2012-09-12 12:23:56,639 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_oracle_OraLoader to file:/opt/ladap/common/hadoop/logs/history/done
    2012-09-12 12:23:56,648 INFO org.apache.hadoop.mapred.JobHistory: Moving file:/opt/ladap/common/hadoop/logs/history/hadoop-namenode_1347465941865_job_201209121205_0007_conf.xml to file:/opt/ladap/common/hadoop/logs/history/done
    My oraloader console log is below:
    [oracle@rakesh hadoop]$ bin/hadoop jar oraloader.jar oracle.hadoop.loader.OraLoader -conf olh-conf/TestAS/scott/testmanagedtable/conf.xml -fs hdfs://hadoop-namenode:9000/ -jt hadoop-namenode:9001
    Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:42 INFO loader.OraLoader: Oracle Loader for Hadoop Release 1.1.0.0.1 - Production
    Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
    12/09/12 12:23:47 INFO loader.OraLoader: Sampling disabled, table: LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO loader.OraLoader: oracle.hadoop.loader.loadByPartition is disabled, LDP_TESTMANAGEDTABLE is not partitioned
    12/09/12 12:23:47 INFO output.DBOutputFormat: Setting reduce tasks speculative execution to false for : oracle.hadoop.loader.lib.output.JDBCOutputFormat
    12/09/12 12:23:47 INFO loader.OraLoader: Submitting OraLoader job OraLoader
    12/09/12 12:23:50 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
    12/09/12 12:23:50 INFO metastore.ObjectStore: ObjectStore, initialize called
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
    12/09/12 12:23:51 ERROR DataNucleus.Plugin: Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ================= Persistence Configuration ===============
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: DataNucleus Persistence Factory initialised for datastore URL="jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true" driver="com.mysql.jdbc.Driver" userName="root"
    12/09/12 12:23:51 INFO DataNucleus.Persistence: ===========================================================
    12/09/12 12:23:52 INFO Datastore.Schema: Creating table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Schema Name could not be determined for this datastore
    12/09/12 12:23:52 INFO Datastore.Schema: Dropping table `DELETEME1347467032448`
    12/09/12 12:23:52 INFO Datastore.Schema: Initialising Catalog "hive", Schema "" using "None" auto-start option
    12/09/12 12:23:52 INFO Datastore.Schema: Catalog "hive", Schema "" initialised - managing 0 classes
    12/09/12 12:23:52 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
    12/09/12 12:23:52 INFO DataNucleus.MetaData: Registering listener for metadata initialisation
    12/09/12 12:23:52 INFO metastore.ObjectStore: Initialized ObjectStore
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 11, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 312, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 359, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 381, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 416, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 453, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 494, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 535, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 576, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 621, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 WARN DataNucleus.MetaData: MetaData Parser encountered an error in file "jar:file:/opt/ladap/common/hadoop-0.20.2-cdh3u1/lib/hive-metastore-0.7.1-cdh3u1.jar!/package.jdo" at line 666, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : `DBS`, InheritanceStrategy : new-table]
    12/09/12 12:23:53 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : `DATABASE_PARAMS`]
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 0 foreign key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 unique key(s) for table `DBS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 2 index(es) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 foreign key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO Datastore.Schema: Validating 1 unique key(s) for table `DATABASE_PARAMS`
    12/09/12 12:23:54 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
    12/09/12 12:23:54 INFO metastore.HiveMetaStore: 0: get_table : db=db_1 tbl=testmanagedtable
    12/09/12 12:23:54 INFO HiveMetaStore.audit: ugi=oracle     ip=unknown-ip-addr     cmd=get_table : db=db_1 tbl=testmanagedtable     
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : `SERDES`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : `SDS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : `TBLS`, InheritanceStrategy : new-table]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : `SERDE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : `TABLE_PARAMS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : `PARTITION_KEYS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : `BUCKETING_COLS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.cols [Table : `COLUMNS`]
    12/09/12 12:23:54 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : `SD_PARAMS`]
    12/09/12 12:23:55 INFO DataNucleus.Persistence: Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : `SORT_COLS`]
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 index(es) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 0 foreign key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDES`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 4 index(es) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 foreign key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 unique key(s) for table `TBLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SDS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SD_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `BUCKETING_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `COLUMNS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `PARTITION_KEYS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `TABLE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SERDE_PARAMS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 2 index(es) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 foreign key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO Datastore.Schema: Validating 1 unique key(s) for table `SORT_COLS`
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable
    12/09/12 12:23:55 INFO DataNucleus.MetaData: Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema
    12/09/12 12:23:55 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    12/09/12 12:23:55 WARN snappy.LoadSnappy: Snappy native library not loaded
    12/09/12 12:23:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    12/09/12 12:23:56 INFO mapred.JobClient: Running job: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: map 0% reduce 0%
    12/09/12 12:23:57 INFO mapred.JobClient: Job complete: job_201209121205_0007
    12/09/12 12:23:57 INFO mapred.JobClient: Counters: 0
    [oracle@rakesh hadoop]$
    Please help. Thanks in advance.
    Regards,
    Rakesh Kumar Rakshit

    Hi Rakesh,
    Can you share the conf.xml and map.xml files you are using? I am trying to do the same (export from Hive to an Oracle DB) and I get the following exception: ClassNotFoundException: org.apache.hadoop.hive.metastore.TableType.
    Best regards,
    Bilal

  • Could not passivate; failed to save state

    Hi guys
    I'm using EJB with a web application on JBoss 4.0.5 with JDK 1.5, and I'm getting this exception:
    11:53:50,390 WARN  [AbstractInstanceCache] failed to passivate, id=f373o8gy-s
    javax.ejb.EJBException: Could not passivate; failed to save state
            at org.jboss.ejb.plugins.StatefulSessionFilePersistenceManager.passivateSession(StatefulSessionFilePersistenceManager.java:423)
            at org.jboss.ejb.plugins.StatefulSessionInstanceCache.passivate(StatefulSessionInstanceCache.java:107)
            at org.jboss.ejb.plugins.AbstractInstanceCache.tryToPassivate(AbstractInstanceCache.java:209)
            at org.jboss.ejb.plugins.AbstractInstanceCache.tryToPassivate(AbstractInstanceCache.java:162)
            at org.jboss.ejb.plugins.LRUEnterpriseContextCachePolicy$OveragerTask.run(LRUEnterpriseContextCachePolicy.java:450)
            at java.util.TimerThread.mainLoop(Timer.java:512)
            at java.util.TimerThread.run(Timer.java:462)
    Caused by: java.io.NotSerializableException: java.lang.ThreadLocal
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1081)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1375)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1347)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1290)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1079)
            at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1375)
            at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1347)
            at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1290)
            at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1079)
            at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:302)
            at org.jboss.ejb.plugins.StatefulSessionFilePersistenceManager.passivateSession(StatefulSessionFilePersistenceManager.java:414)
            ... 6 more
    My code (stateful session bean):
    protected transient Log log;
    private transient IdentitySession identitySession;
    public void ejbPassivate() throws EJBException, RemoteException {
            log.debug("bean passivating... ");
            log = null;
            identitySession.close();
            identitySession = null;
        }
    The attributes are transient because I had read somewhere that this would avoid serialization problems, but the problems remain; I've tried with and without the transient modifiers and the output is the same.
    IdentitySession is a non-serializable object, and Log is an interface from Commons Logging whose implementations are also non-serializable.
    can anyone help?
    Message was edited by:
    RicardoM

    So those are the ONLY fields in your bean? Does the class declare a ThreadLocal field, or does a
    super-class declare one? If not, it could be a JBoss bug. I'd recommend posting on the JBoss forum.
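
    For reference, here is a minimal sketch of the passivation pattern being aimed for, assuming an EJB 2.x stateful bean; the class name and the use of Commons Logging are illustrative. The idea is to drop every non-serializable reference in ejbPassivate() and reacquire it in ejbActivate(). Note this only helps if the offending object is held by a field of the bean class itself; if a ThreadLocal is reachable through a non-transient field of a superclass or of a referenced object, serialization will still fail, which is what the stack trace suggests.

    import javax.ejb.EJBException;
    import javax.ejb.SessionBean;
    import javax.ejb.SessionContext;
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;

    public class PassivationSafeBean implements SessionBean {

        // transient: excluded from the serialized state written at passivation
        private transient Log log;

        public void setSessionContext(SessionContext sc) {
            // the container restores its own context objects, so nothing
            // special is needed here
        }

        public void ejbCreate() {
            // create method matching the home interface convention
            log = LogFactory.getLog(PassivationSafeBean.class);
        }

        public void ejbPassivate() throws EJBException {
            // drop every non-serializable reference before the state is saved
            log = null;
        }

        public void ejbActivate() throws EJBException {
            // reacquire what was dropped in ejbPassivate()
            log = LogFactory.getLog(PassivationSafeBean.class);
        }

        public void ejbRemove() throws EJBException {
        }
    }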

Maybe you are looking for

  • Adobe Premiere Pro and After Effects CS5.5 Lagging Horribly with Dynamic Link

    I just bought a brand new computer. It is a beast Windows 7 64-bit Intel i7-980 6 cores with HT = 12 CPU cores Nvidia GTX 570HD 1280mb 328 bit 120gb SSD sata 3 600 gb Velociraptor drive 10000rpm 8gb DDR3 1866MHZ ram So you would think I would get no

  • "Other" category on iTunes for iPod touch 4g

    On iTunes, the "other" category on my ipod touch summary takes up 1.6 gb of my space, and I want to reduce that. Anybody know what the "other" category means, and how I can take off the space on it?

  • TS1436 Hi and Help.  I can't burn a CD on my I Mac.

    This is a new issue.  I have always been able to do it in the past.  Since updating to the new version of I-Tunes, I can't do it.  I am able to play a store bought CD and Video.  I am able to burn a disk with Photos on it, but I cannot burn a music C

  • Error While Installing SAP GUP

    Hi I got the following error NWSAPsetup was not able to grant your administrator rights either because -destribution service couldn't respond I have only one account with my system and initial there is no password for my pc then i give password to sy

  • Nokia 6300 alarm & ringtone volume

    the alarm tone is too weak and even the ringtone is not loud enuf.. any suggestions?