Offline data migration fails for BLOB field from MySQL 5.0 to Oracle 11g

I tried to use the standalone Data Migration tool several years ago to move a database from MySQL to Oracle. At that time it was unable to migrate BLOB fields. I am trying again, hoping this issue might have been fixed in the meantime. That does not appear to be the case.

The rows in question have a single BLOB field: a binary encoding of a serialized Java object, containing on the order of 1-2K bytes, a mixture of plain text and a small amount of non-ASCII data which is presumably part of the structure of the Java object. The mysqldump appears to store the data correctly, surrounded by the expected <EOFD> and <EORD> separators. The data as imported consists of a short run (roughly 100-200) of ASCII characters, apparently hex encoded, because if I do a hex dump of the mysqldump I can recognize some of the character pairs that appear in the BLOB field after import. However, they appear flipped within each word or otherwise displaced from each other (although both source and destination machines are x86 family), and the imported record stops long before all the data is encoded.
For example, here is a portion of the record as imported:
ACED0005737200136A6
and here is a hex dump of the input:
0000000 3633 3838 3037 3c39 4f45 4446 303e 3131
0000020 3036 3830 3836 453c 464f 3e44 312d 453c
0000040 464f 3e44 6e49 7473 7469 7475 6f69 446e
0000060 7461 3c61 4f45 4446 ac3e 00ed 7305 0072
0000100 6a13 7661 2e61 7475 6c69 482e 7361 7468
0000120 6261 656c bb13 250f 4a21 b8e4 0003 4602
0000140 0a00 6f6c 6461 6146 7463 726f 0049 7409
0000160 7268 7365 6f68 646c 7078 403f 0000 0000
AC ED appears split across the 5th and 6th words of the 4th line, 00 05 across the 6th and 7th words, and so on.
I see explicit references to using hex encoding for MS SQL and other source DBs, but not for MySQL.
I suspect the encoder is hitting some character within the binary data that aborts the encoding process, because the records I've looked at so far contain the same data (roughly 150 characters) for every record, and when I look at the binary input, that data appears to be part of the Java object structure, which would repeat for every record.
Here is the ctl code:
load data
infile 'user_data_ext.txt' "str '<EORD>'"
into table userinfo.user_data_ext
fields terminated by '<EOFD>'
trailing nullcols
(
internal_id NULLIF internal_id = 'NULL',
rt_number "DECODE(:rt_number, 'NULL', NULL, NULL, ' ', :rt_number)",
member_number "DECODE(:member_number, 'NULL', NULL, NULL, ' ', :member_number)",
object_type "DECODE(:object_type, 'NULL', NULL, NULL, ' ', :object_type)",
object_data CHAR(2000000) NULLIF object_data = 'NULL'
)

It looks like the data is actually being converted correctly. The apparent byte-flipping was just the hex dump utility printing 16-bit words in little-endian order on x86. Beyond that, what threw me off was that the mysql client displays the raw blob bytes, while sqlplus automatically converts them to hex for display, and only shows about two lines of that hex data. When I check the field lengths, they are correct.
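For anyone else second-guessing a load like this, an easy cross-check is to compare lengths and leading bytes on the Oracle side against the MySQL source. A minimal sketch, reusing the table and column names from the ctl file above (userinfo.user_data_ext, internal_id, object_data):

-- Length in bytes of each migrated BLOB
SELECT internal_id,
       DBMS_LOB.GETLENGTH(object_data) AS blob_bytes
  FROM userinfo.user_data_ext;

-- First bytes of a few rows; a serialized Java object should start with AC ED 00 05
SELECT internal_id,
       DBMS_LOB.SUBSTR(object_data, 8, 1) AS first_bytes
  FROM userinfo.user_data_ext
 WHERE ROWNUM <= 10;

On the MySQL side, SELECT internal_id, LENGTH(object_data) FROM user_data_ext; gives the byte counts to compare against.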

Similar Messages

  • Data Migration program for opportunity

Hi SAP Experts,
Can anyone please help with the data migration program for opportunities?
I want to know the approach.
These are a few of the mandatory fields:
1. Opportunity Name
2. Account
3. Contact
4. Opportunity Owner
5. Total Estimate/Curr (value_currency)
6. Project Start Date
7. Stage
8. Probability
9. Status
10. Product with Item Category ZSOL
11. Service
12. Service Offering
Regards,
Jaya

    Hi,
As suggested by Kai, you can use LSMW. This is the better option as it is very easy to use, and you can also write routines while mapping the fields. You can also use IDocs for importing data from the legacy system.
If you want to write a report, you can use BAPI_BUSPROCESSND_CREATEMULTI; please refer to that BAPI's documentation. This BAPI internally calls CRM_ORDER_MAINTAIN. Do not forget to call BAPI_BUSPROCESSND_SAVE to commit your data.
    Regards,
    Sandeep

  • Data Migration Template for Singapore Payroll.

    Hi
I need help with a data migration template for Singapore Payroll.
I would also need guidance on what is required in case I want to migrate last year's payroll cluster data from the legacy system to ECC 6.0.
I will appreciate it if you can send me the migration template at my mail id.
    Thanks and regards

Hi,
Check some of the fields mentioned below; they may be of some use to you:
    Client
    Personnel Number
    Sequential number for payroll period
    Payroll type
    Payroll Identifier
    Pay date for payroll result
    Period Parameters
    Payroll Year
    Payroll Period
    Start date of payroll period (FOR period)
    End of payroll period (for-period)
    Reason for Off-Cycle Payroll
    Sequence Number
    Client
    Personnel Number
    Sequence Number
    Country grouping
    Wage type
    Key date
    Rate
    Number
    Amount

Error when reading BLOB field from Oracle using TopLink

We are experiencing a very annoying problem when trying to read a BLOB field from Oracle 8.1.6.2.0 using TOPLink 3.6.3. I have attached the exception stack trace that is reported to the console. As far as I can judge, a fault at oracle.sql.LobPlsqlUtil.plsql_length() happens first, followed by one at TOPLink.Private.DatabaseAccess.DatabasePlatform.convertObject(). The exception repeats constantly, which is very critical for us.
    ServerSession(929808)--Connection(5625701)--SELECT LOBBODY, ID, LABEL, FK_OBJECT_ID, FK_OBJECTTYPE FROM NOTE WHERE (ID = 80020)
    INTERNAL EXCEPTION STACK:
    java.lang.NullPointerException
    at oracle.sql.LobPlsqlUtil.plsql_length(LobPlsqlUtil.java:936)
    at oracle.sql.LobPlsqlUtil.plsql_length(LobPlsqlUtil.java:102)
    at oracle.jdbc.dbaccess.DBAccess.lobLength(DBAccess.java:709)
    at oracle.sql.LobDBAccessImpl.length(LobDBAccessImpl.java:58)
    at oracle.sql.BLOB.length(BLOB.java:71)
    at TOPLink.Private.Helper.ConversionManager.convertObjectToByteArray(ConversionManager.java:309)
    at TOPLink.Private.Helper.ConversionManager.convertObject(ConversionManager.java:166)
    at TOPLink.Private.DatabaseAccess.DatabasePlatform.convertObject(DatabasePlatform.java:594)
    at TOPLink.Public.Mappings.SerializedObjectMapping.getAttributeValue(SerializedObjectMapping.java:43)
    at TOPLink.Public.Mappings.DirectToFieldMapping.valueFromRow(DirectToFieldMapping.java:490)
    at TOPLink.Public.Mappings.DatabaseMapping.readFromRowIntoObject(DatabaseMapping.java:808)
    at TOPLink.Private.Descriptors.ObjectBuilder.buildAttributesIntoObject(ObjectBuilder.java:173)
    at TOPLink.Private.Descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:325)
    at TOPLink.Private.Descriptors.ObjectBuilder.buildObjectsInto(ObjectBuilder.java:373)
    at TOPLink.Public.QueryFramework.ReadAllQuery.execute(ReadAllQuery.java:366)
    at TOPLink.Public.QueryFramework.DatabaseQuery.execute(DatabaseQuery.java:406)
I started the application with Oracle JDBC logging on and found that the problem may originate in a possible lack of synchronization in the pooled connection implementation:
    DRVR FUNC OracleConnection.isClosed() returned false
    DRVR OPER OracleConnection.close()
    DRVR FUNC OracleConnection.prepareCall(sql)
    DRVR DBG1 SQL: "begin ? := dbms_lob.getLength (?); end;"
    DRVR FUNC DBError.throwSqlException(errNum=73, obj=null)
    DRVR FUNC DBError.findMessage(errNum=73, obj=null)
    DRVR FUNC DBError.throwSqlException(reason="Logical handle no longer valid",
    SQLState=null, vendorCode=17073)
    DRVR OPER OracleConnection.close()
    so the prepareCall() is issued against an already closed connection and the
    call fails.
I assume we have been using a JDBC 2.0 compliant driver. We tried the drivers that Oracle supplies for the 8.1.6 and 8.1.7 versions. In truth, I couldn't find any information about the JDBC specification they conform to. Does that mean these drivers may not be 100% compatible with the JDBC 2.0 spec? How can I find out if they are 2.0 compliant?
I also downloaded the Oracle 9.2.0.1 JDBC drivers. These seemed to work fine until we found another incompatibility which made us return to the 8.1.7 driver:
    UnitOfWork(7818028)--Connection(4434104)--INSERT INTO STATUSHISTORY (CHANGEDATE, FK_SET_STATUS_ID) VALUES ({ts '2002-10-17 16:46:54.529'}, 2)
    INTERNAL EXCEPTION STACK:
    java.sql.SQLException: ORA-00904: invalid column name
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
    at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:573)
    at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1891)
at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1093)
at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2047)
at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1940)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2709)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:589)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:906)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:960)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:819)
at TOPLink.Public.PublicInterface.UnitOfWork.executeCall(UnitOfWork.java:

    Hello Yury,
    I believe the problem is that TopLink's ServerSession by default executes read queries concurrently on the same connection. It does this to reduce the number of required connections for the read connection pool. Normally this is a good concurrency optimization, however some JDBC drivers have issues when this is done. I had thought that with Oracle JDBC 8.1.7 this issue no longer occurred, but perhaps it is only after version 9. I believe that the errors were only with the thin JDBC driver, not the OCI, so using the OCI driver should also resolve the problem. Using RAW instead of BLOB would also work.
    You can configure TopLink to resolve this problem through using exclusive read connection pooling.
    Example:
serverSession.useExclusiveReadConnectionPool(int minNumberOfConnections, int maxNumberOfConnections);
This will ensure that TopLink does not try to concurrently execute read queries on the same connection.
I'm not exactly sure what your second problem with the 9.x JDBC drivers is. From the SQL and stack trace, it would seem that the driver does not like the JDBC timestamp syntax. You can have TopLink print timestamps in Oracle's native SQL format to resolve this problem.
    Example:
serverSession.getLogin().useNativeSQL();
Make sure you configure your server session before you log in, or, if using the TopLink Session Manager, perform the customizations through a SessionEventListener preLogin event.

  • Inserting Image into BLOB field from URL

Is it possible to insert an image into a BLOB field from a URL? I don't have access to the filesystem on the server, and I have a page that uses PL/PDF to generate dynamic PDF files from data I pull from the database. I would like to include my company's official header with logo at the top of the PDFs I generate, and I need to somehow get at the image, which I currently have uploaded to the images section in Application Express and can access from the "Shared Components => Images" section of the Application Builder. I just want to be able to either query whatever table holds those images or copy the image I need to a BLOB field in one of my tables so I can use it in my PDF document.
The other alternative (which is what I first mentioned above) would be to try to perform an insert into the BLOB field in my table, but the location of the image file would be like "http://www.myportal.com:7777/pls/htmldb/wwv_flow_file_mgr.get_file?p_security_group_id=1234567&p_flow_id=111&p_fname=myimage.jpg".
Is this possible? If not, what can I do?
    Thanks in advance.

The reason why I am asking this question in this forum is two-fold:
1. I am using Application Express (APEX).
2. Since I am using APEX, all images that I upload to my workspace are stored in a table in the database, not as actual files in a directory structure as in Windows etc. This being the case, I can't reference any of the files that I upload to my workspace in order to insert them into my BLOB field unless I know their physical location on disk (which of course does not exist, because they are stored in the APEX database somewhere). If I knew APEX's database and table structure, I could do a "select into ..." from whatever table stores my uploaded images and save those images to the BLOB field in my own custom table.
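In case it helps anyone with the same need, here is a minimal sketch of both routes. It assumes the uploaded workspace files are visible through the APEX_APPLICATION_FILES view (older HTML DB releases exposed HTMLDB_APPLICATION_FILES instead) and uses a hypothetical MY_IMAGES table; adjust names to your environment:

-- Route 1: copy the uploaded workspace image into your own table
INSERT INTO my_images (image_name, image_data)
SELECT filename, blob_content
  FROM apex_application_files
 WHERE filename = 'myimage.jpg';

-- Route 2: fetch straight from the URL (needs DB-side HTTP access)
INSERT INTO my_images (image_name, image_data)
VALUES ('myimage.jpg',
        HTTPURITYPE('http://www.myportal.com:7777/pls/htmldb/wwv_flow_file_mgr.get_file?p_security_group_id=1234567&p_flow_id=111&p_fname=myimage.jpg').GETBLOB());

Route 1 avoids the HTTP round trip entirely, which is usually the simpler option when the image already lives in the APEX repository.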

  • Passing values to subreport in SSRS throwing an error - Data Retrieval failed for the report, please check the log for more details.

    Hi,
I have a subreport called from the main report. The subreport is based on an MDX query against the SSAS cube; some dimensions in the cube have values 0 and 1.
When I try to pass '0' to the subreport as the parameter value, it gives the error "Data Retrieval failed for the report, please check the log for more details".
Actually I am using a table to store these parameter values. In the main report I call this table (dataset) and pass these values to the subreport.
When I give them together, like [0],[1], it works fine; when I give only [0] or only [1], it throws the error.
Could you please advise on this?
I appreciate any and all help.
    Thanks,
    Divya

    Hi Divya,
Based on the current description, I understand that there is no issue when you pass two values from the main report to the subreport, while the issue occurs when passing one value.
To narrow down the issue, I want to confirm whether the subreport can run when there is only [0] or [1] in the subreport. If so, it indicates the query statements in the subreport contain an error. If that's not the case, the issue occurs while passing values from the main report to the subreport. For further analysis, please post the details of the subreport's query statements to the forum.
    Regards,
    Heidi Duan
    TechNet Community Support

  • Data migration approach for Scheduling Agreements

    Gurus,
Can anyone provide guidance on the data migration approach for scheduling agreements? How can we migrate the open delivery schedules and the respective cumulative quantities? The document type being used is "LZ".
Can correction deliveries (doc type LFKO) be used to update the initial cumulative quantities?
Any help in this regard is highly appreciated.
    Regards,
    Gajendra

    Hi Zenith
    You might find useful information here: IS-U data export (extraction) for EMIGALL.
    I've done numerous IS-U migrations in the last 10+ years, but never one from IS-U to IS-U.
One main question to start with: are the two systems on the same release level?
If they are not, and it would be too much effort to get them onto the same release level, I would think creating your own extract programs and using the Migration Workbench (EMIGALL) is the way to go.
If they are on the same level, there might be other ways to do it, depending on the differences between the two systems (mainly around customising). Assuming there are significant differences - otherwise why would you bother migrating? - there is a high probability that using EMIGALL is the way to go as well.
    Yep
    Jürgen

  • Value Help for a field from custom table in BSP

Please let me know the procedure to create a value help for a field from a custom table in BSP using HTMLB.
    Thanks

Hi Prodigy,
Check this code using a dropdown list.
    <b>layout for first page</b> 
    <htmlb:dropdownListBox id                = "ddlist"
                                  table             = "<%=t_mara%>"
                                  nameOfKeyColumn   = "matnr"
                                  nameOfValueColumn = "matnr"></htmlb:dropdownListBox>
    <b>event Oninitialization event of first page</b>
    SELECT MATNR UP TO 10 ROWS
      INTO TABLE T_MARA
      FROM MARA.
<b>event Oninputprocessing event</b>
* Load the manager class
CLASS CL_HTMLB_MANAGER DEFINITION LOAD.
DATA: V_EVENT TYPE REF TO CL_HTMLB_EVENT.
DATA: DATA TYPE REF TO OBJECT.
DATA: V_DATA TYPE REF TO CL_HTMLB_DROPDOWNLISTBOX.
DATA: V_MATNR TYPE MATNR.
CALL METHOD CL_HTMLB_MANAGER=>GET_EVENT
  EXPORTING
    REQUEST = RUNTIME->SERVER->REQUEST
  RECEIVING
    EVENT   = V_EVENT.
IF V_EVENT->NAME = 'button' AND
   V_EVENT->EVENT_TYPE = 'click'.
* Read the dropdown list box state back from the request
  DATA ?= CL_HTMLB_MANAGER=>GET_DATA( REQUEST = RUNTIME->SERVER->REQUEST
                                      NAME    = 'dropdownListBox'
                                      ID      = 'ddlist' ).
  V_DATA ?= DATA.
  IF V_DATA IS NOT INITIAL.
    V_MATNR = V_DATA->SELECTION.
  ENDIF.
  NAVIGATION->SET_PARAMETER( NAME = 'v_matnr' VALUE = V_MATNR ).
  NAVIGATION->GOTO_PAGE( 'detail.htm' ).
ENDIF.
<b>in the onInitialization event for detail page</b>
SELECT MATNR
           ERSDA
           ERNAM
           LAEDA
           AENAM
           VPSTA
           PSTAT
           LVORM
           MTART
      FROM MARA
      INTO TABLE T_MARA
    WHERE MATNR = V_MATNR.

  • Drop down in the selection for the field from the table

    Hi.
I want to put a drop down for a field from a table for which I don't know the number of entries: field ZREGION1 of table ZBWCNTRY.
Please tell me which function module to use and what the lines of code would be.
The drop down is for the select-option on the selection screen of a classical report.
Please help.

Hi,
Check the code below; it may help you.
REPORT ZBUNU.
TYPES: BEGIN OF STR,
         MATNR TYPE MATNR,
       END OF STR.
DATA: ITAB TYPE TABLE OF STR WITH HEADER LINE,
      VAR TYPE MARA-MATNR.
PARAMETERS: S_MATNR TYPE MATNR,
            P_MATNR TYPE MATNR.

AT SELECTION-SCREEN ON VALUE-REQUEST FOR S_MATNR.

SELECT MATNR FROM MARA
  INTO CORRESPONDING FIELDS OF TABLE ITAB UP TO 1000 ROWS.

CALL FUNCTION 'F4IF_INT_TABLE_VALUE_REQUEST'
  EXPORTING
*   DDIC_STRUCTURE   = ' '
    RETFIELD         = 'MATNR'
*   PVALKEY          = ' '
    DYNPPROG         = SY-CPROG
    DYNPNR           = SY-DYNNR
    DYNPROFIELD      = 'S_MATNR'
*   STEPL            = 0
*   WINDOW_TITLE     =
*   VALUE            = ' '
    VALUE_ORG        = 'S'
*   MULTIPLE_CHOICE  = ' '
*   DISPLAY          = 'X'
*   CALLBACK_PROGRAM = ' '
*   CALLBACK_FORM    = ' '
  TABLES
    VALUE_TAB        = ITAB
*   FIELD_TAB        =
*   RETURN_TAB       =
*   DYNPFLD_MAPPING  =
  EXCEPTIONS
    PARAMETER_ERROR  = 1
    NO_VALUES_FOUND  = 2
    OTHERS           = 3.
IF SY-SUBRC <> 0.
  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
    Regards
    Ansumesh

  • Order Import Failed in OM : Log Validation failed for the field - Ship To

    Problem in Order Management
When I tried to do Order Import in the source org, the log file shows the message "Validation failed for the field - Ship To". Please share if anyone has a solution for this.
    Thanks in advance.
    Regards,
    Rajesh Verma
    Senior Consultant- Oracle Apps.
    COLT

Perhaps you might want to try populating customer_id instead of the name, to make sure there is no error in typing the customer name.
A primary ship-to and bill-to must exist under this customer. During order import, if the ship-to is not specified, the import will fetch the customer's primary ship-to (see the query sketch below).
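A quick way to verify that before importing: a hedged sketch against the standard TCA tables (assuming 11i/R12 names; :customer_id is the CUST_ACCOUNT_ID you populate as sold_to_org_id):

-- Active ship-to / bill-to site uses defined under the customer account
SELECT su.site_use_code, su.primary_flag, su.status, su.site_use_id
  FROM hz_cust_acct_sites_all cas,
       hz_cust_site_uses_all su
 WHERE cas.cust_account_id = :customer_id
   AND su.cust_acct_site_id = cas.cust_acct_site_id
   AND su.site_use_code IN ('SHIP_TO', 'BILL_TO')
   AND su.status = 'A';

If no SHIP_TO row comes back with primary_flag = 'Y', the import has nothing to default to and fails exactly this way.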
This is what we use to populate the interface tables, with a minimum of data:
    INSERT INTO oe_headers_iface_all
    (orig_sys_document_ref,order_source_id,org_id
    ,order_type_id,payment_term_id, shipping_method_code, freight_terms_code
    ,customer_po_number,salesrep_id
    ,sold_to_org_id, ship_to_org_id,invoice_to_org_id,sold_to_contact_id
    ,booked_flag
    ,created_by, creation_date, last_updated_by, last_update_date,last_update_login
    ,operation_code, order_category
    ,attribute5,tp_attribute4,xml_message_id,xml_transaction_type_code
    ,request_id, error_flag)
    INSERT INTO oe_lines_iface_all
    (order_source_id, orig_sys_document_ref, orig_sys_line_ref,orig_sys_shipment_ref
    ,inventory_item,item_type_code,line_type_id
    ,top_model_line_ref,link_to_line_ref,component_sequence_id,component_code,option_flag
    ,ordered_quantity
    ,order_quantity_uom,salesrep_id
    ,created_by, creation_date, last_updated_by, last_update_date,last_update_login
    ,operation_code,cust_model_serial_number,line_category_code
    ,context,attribute6
    ,reference_type, reference_line_id, reference_header_id
    ,return_context, return_attribute1, return_attribute2
    ,return_reason_code
    ,tp_attribute1,tp_attribute2,tp_attribute3,tp_attribute4,tp_attribute5
    ,request_id,error_flag)

  • Validation failed for the field - Bill To Contact, Ship To Contact

    Hi,
When I am trying to import / enter an order for a specific customer, I am getting the errors:
VALIDATION FAILED FOR THE FIELD - BILL TO CONTACT
VALIDATION FAILED FOR THE FIELD - SHIP TO CONTACT
Has anyone gotten this error before? If you know the resolution, please update me on this.

    Hi,
I am getting the following error when I am adding one customer's address to another customer through the Order Import program, populating 3 interface tables:
1. OE_CUSTOMER_INFO_IFACE_ALL
2. OE_HEADERS_IFACE_ALL
3. OE_LINES_IFACE_ALL
Error message:
Duplicate SHIP_TO ADDRESS found for IC_804810. Please correct the data.
Duplicate BILL_TO ADDRESS found for IC_804810. Please correct the data.
But it works fine when I give a new address for an existing customer.
Please help me with this.
    Thanks,
    Oracle Support.

  • Data Retreival Failed for subreport

    Hi all
    I am getting an error when trying to embed a subreport (VS 2013)
    The message I get is:
    Data retrieval failed for the subreport, 'rptBlahBlah' located at project.rptBlahBlah.rdlc. Please check the log files for more information.
Assuming the logs are the ones in C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles, there are no relevant error messages or anything to indicate that anything failed.
I have removed all parameters from the subreport and from the query that should be loading the data, but the result is the same.
I have put the subreport directly into a ReportViewer and it shows up just fine, so the report definition and the dataset must be OK.
    Where else/What else can I look at?
    Thanks

    Hi Anil,
This is a known issue, which you can see at this link: https://connect.microsoft.com/SQLServer/feedback/details/648560/subreport-with-shared-dataset-throws-error
The issue occurs only in Business Intelligence Development Studio (BIDS) when using a shared dataset in the subreport, and it is fixed in SQL Server Reporting Services 2012. You can avoid it by changing the shared dataset to an embedded dataset, or by deploying the shared dataset to the Report Server and viewing the report in Report Manager.
    If you have any questions, please feel free to ask.
    Regards,
    Charlie Liao
    TechNet Community Support

  • How to handle "Validation failed for the field - Tax code" issue?

We had mass-uploaded orders created in March with a tax code whose effective date is in April. Now we would like to process a return against such an order and are getting the error "Validation failed for the field - Tax code". How do we handle this issue?

    Hi
You will have to check whether the tax_code of the RMA being received is the same as the one on the related sales order.
If not, you will need to use the same tax_code; a comparison query sketch follows below.
Refer to this document: Doc ID 1584338.1
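For reference, a sketch of that comparison (assuming the 11i/R12 OM schema, where OE_ORDER_LINES_ALL carries TAX_CODE and RMA lines point back to the original line via REFERENCE_LINE_ID; adjust to your release):

-- Tax code on each RMA line vs. the referenced sales order line
SELECT rl.line_id  AS rma_line_id,
       rl.tax_code AS rma_tax_code,
       ol.line_id  AS order_line_id,
       ol.tax_code AS order_tax_code
  FROM oe_order_lines_all rl,
       oe_order_lines_all ol
 WHERE rl.header_id = :rma_header_id
   AND rl.line_category_code = 'RETURN'
   AND rl.reference_line_id = ol.line_id;

Any row where the two tax codes differ is a candidate for the validation failure.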

  • How to find table name for the fields from Standard Extractor in CRM system

How do I find the table name for fields from a standard extractor in a CRM system?
E.g., we use the LBWE transaction in the R/3 system to find the table name for a field from an extractor such as VCSCL.
Likewise, is there any way to find the table names for the fields of a standard CRM extractor like 0CRM_LEAD_I?

Hi,
Please find the link below for understanding BW CRM analysis:
http://help.sap.com/bp_biv135/html/bw.htm
Activate the CRM DataSources by scenario:
    1) Activate the application component hierarchy (tcode RSA9). Changes made to the application component hierarchy in the CRM system can be transferred to the BW using the "Edit Application Component Hierarchy" (SBIW - Postprocessing of DataSources).
    SAP Note 434886 must be implemented in CRM 3.0 before the application component hierarchy is activated.
    2) Activate the Business Content DataSources (tcode RSA5).
    Select/enter the application component and choose Execute (F8).
    To compare the shipped and active versions, choose the 'Select Delta' pushbutton. If there is no active version of the DataSource, it is selected automatically.
    To activate the shipped version, choose the 'Transfer DataSources' pushbutton.
    3) Management of the versions of the BW-Adapter metadata (tcode BWA5). All DataSources are displayed that are managed by the BW Adapter.
    As in transaction RSA5 (Service API Metadata Activation), the 'Select Delta' function can be used to select the inactive DataSources or compare shipped and active versions.
    You can also go directly to the screen for maintaining DataSources that are managed by the BW Adapter.
    The 'Compare Version' function makes a detailed comparison of the shipped and active versions.
    All BW-Adapter metadata is considered when versions are compared:
    Header information (Table SMOXHEAD)
    Mapping information (Table SMOXRELP)
    Global selection conditions (Table SMOXGSEL)
    Attribute key fields (Table SMOXAFLD)
    Hope this helps.
    Regards,
    csm reddy

  • Order Import - Validation failed for the field - End Customer Location

I am trying to import an order using the standard import program and it is failing with the error below:
- Validation failed for the field - End Customer Location
I have checked Metalink to debug this and found nothing; the values I have used to populate the interface table are valid.
    Can someone please post your views on this, on where the problem might be.
    Thanks!

    Hello,
Try the following:
1) Check if any processing constraints are applied to the customer/location.
2) Try to create a new customer and associate an internal location.
Then create a new order to reproduce the issue.
    Thanks
    -Arif
