Meaning of Fields in SXMB_MONI

Hi All,
I have some confusion about the information displayed in transaction SXMB_MONI. Can you help to confirm whether my understanding is correct?
1. In a synchronous scenario, one typical Caller -> XI -> Callee procedure will have 4 messages logged in SXMB_MONI. They are as follows:
Sender (message sent from sender adapter to IE)
  Response (response message from IE to sender adapter)
Central (message sent from IE to receiver adapter)
  Response (message sent from receiver adapter to IE)
Sender and central messages share a message ID. The two response messages share one ID.
Is my understanding correct?
2. In an asynchronous scenario, one typical Sender -> XI -> Receiver procedure has 1 message logged.
That message contains 2 parts CENTRAL and PE_ADAPTER. What's the meaning of these two parts?
That message can be divided into inbound message, receiver grouping, response and call adapter. What's the meaning of each part?
Thanks + Best Regards
Jerome

Thanks for the explanation.
I checked the pipeline SAP_CENTRAL. It contains the following pipeline steps:
Receiver Identification
Interface Determination
Message Branch According to Receiver List
Request Message Mapping
Technical Routing
Call Adapter
Request Message Mapping
But in SXMB_MONI, when I double-click the "central" item in the top-left window, I don't find the last pipeline step "Request Message Mapping". By double-clicking the "response" item, I found there is a "Request Message Mapping" after "Call Adapter". May I ask the reason for such a display?
Thanks, Jerome
Message was edited by: Jerome Zhao

Similar Messages

  • Meaning of field  upgrade in table smodilog

    Hi all ,
Do you know what is the meaning of the field UPGRADE in table smodilog?
The thing is that after an upgrade of our test system, using the SPAU CR (that was imported during the upgrade in phase ADJUST_PRP),
when I go to SPAU I see that the objects that appear there got the right version, but they are not adjusted.
If I delete the value from field UPGRADE in table smodilog for one of these objects, the object is set to adjusted in SPAU.
    Any ideas ,
    Regards ,
    Oren

    Hi
    The field SATZA means RECORD TYPE for confirmation & it gets updated when RECORD TYPES are copied from table TEVEN to AFRU.
    Check the SAP standard report CORUSART.
    Hope this helps.
    Regards
    Abhii

  • What is the mean IDDAT field of ORDRSP idoc's E1EDP01/E1EDP03 segment

    Hi experts,
    I'm trying to use ORDRSP idoc to map to a target structure.
I need to know the meaning of the IDDAT field of the ORDRSP IDoc's E1EDP01/E1EDP03 segment.
For example, if IDDAT = 024, what does it mean?
Is there a list that shows the meanings of fields with different values?
    Thanks.

    Hi,
the segment E1EDP03 is a "qualified" segment.
The possible values of a qualifier field are defined in the DDIC as the domain value range (fixed values or a value table) of the corresponding data element.
In your case, the qualifier (data element) IDDAT has the domain DATID assigned, with the T54C5 value table.
So the possible values are contained in the value table T54C5.
Using the maintenance transaction SM30 it's possible to change the value list; for this reason, check in your own system the meaning of the qualifier IDDAT = '024'.
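As an illustration of how a qualified segment works, here is a minimal Python sketch of a value-table lookup. The codes and meanings below are hypothetical placeholders; the real meanings of IDDAT values must be read from T54C5 in your own system:

```python
# Hypothetical stand-in for the T54C5 value table: qualifier code -> meaning.
# The actual meanings must come from T54C5 in your system (e.g. via SM30).
t54c5 = {
    "001": "delivery date",  # hypothetical
    "024": "pricing date",   # hypothetical
}

def qualifier_meaning(iddat: str) -> str:
    """Resolve an IDDAT qualifier code to its meaning, like a DDIC value-table lookup."""
    return t54c5.get(iddat, "unknown qualifier")

print(qualifier_meaning("024"))
```

The point is simply that the qualifier value is a code, and the code's meaning lives in the assigned value table, not in the IDoc itself.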
    Regards,
    Andrea

  • Meaning of fields...........

    Hello Sir/Madam,
         Please tell me, in table VBAK, what is the meaning of "distribution channel", "division" & "plant"?

    Distribution Channel
    Technical Name: 0CP_CHANNEL
    Use
    The characteristic indicates the distribution channel that a consumer products manufacturer chooses to sell their products. The characteristic references 0BP_GRP.
In the SAP business process:
  Sales Area = Sales Organization + Distribution Channel + Division.
  It is a triangular interaction:
                Sales Organization
               /                  \
      Dist. Channel -------- Division
Sales Org controls Distribution Channel and Division.
Distribution Channel: the channel that the organization chooses to make its product reach the end customer (the network).
Division: e.g. HLL -> Detergents, Out of Home, Food Products, Health Care sectors/departments;
          Tata Motors -> Trucks/Buses, Cars, Heavy Vehicles, etc.
    Please note it is not:
        Sales organization
                |
                |            
        Dist.Channel
                |
                |
          Division
    Division is an Organizational Unit.
    The Division in the Material Master is not an Organizational Unit that can be used to maintain related fields.  It is a field which is used to uniquely assign a material to a Division.

  • Meaning only field does not work when querying

    Hi
    I am working with Designer 6i R4.11
    In my generated forms I have columns/fields based on "domains/lovs". The domain property "dynamic list" is set to yes and I have all my values in a cg_ref_codes table.
    Now when I am doing a query in my form and I use the lov-button of the "domain-field", the query will work fine. When I just type in the value in the field without using the lov, the query will always return all values.
    There is no trigger or code generated to fill in the database-column value in query mode, so from a Form Builder point of view it makes sense.
    I have the impression this used to work years ago. Is there a property I didn't set or am I missing something?
    Thanks in advance for your help.
    Karine

    Seems to me that you need to set the display width to be wide enough for all possible meanings in the LOV, rather than wide enough for the field itself. But I may have misunderstood the question. or misremembered the answer - it has been a while.

  • Masking the Single Field in SXMB_MONI

    Hi,
    I have a requirement where I am getting a credit card number in one of my scenarios, with PI as middleware. I want to mask the credit card number coming in the payload, and after processing I want to unmask it and send it to the receiver.
    Are there any Module Beans available in PI to achieve this(Masking and Unmasking) ?
    If not then How can I achieve this?
    Thanks &Regards,
    Deepthi.M

    Hi Deepthi,
    As said by Roy, there is no standard adapter module to mask/unmask the field.
    There is another possibility: encrypting the complete payload and then decrypting it. This will be highly secure.
    Ref: file encryption and decryption of the XML file
    Thanks,

  • Meaning of field GUID of a CRM Order object in Datasource 0CRM_CONTACT_OUT

    Dear Sirs,
    The standard datasource Customer Contact: Outbound (0CRM_CONTACT_OUT) delivers a field Guid of a CRM Order object (0CRM_0HGUID).
    What does this field actually give you?
    Is it a GUID of an actual order created in CRM for the given BP (in a given marketing element / target group)?
    best regards,
    Jørgen

    If you search in the SAP Help you will find technical information for the extractor, with the table & field name of each extracted field... copying the link won't work as it isn't completely "shown", but this is what I found:
    Field in Extract Structure | Description of Field in the Extract Structure | Table of Origin | Field in Table of Origin
    PARTNER | Business partner number | CRMD_IM_ML_ITEM | PARTNER_GUID
    CAMPAIGN_ELEMENT | Project planning: 16 character GUID for tasks | CRM_IM_ML_HEAD | ELEMENT_GUID
    EXTERNAL_ID | Project planning: external ID for an element | CGPL | PROJECT or TASK
    CHANNEL | CRM marketing planning - communication channel | CRMD_IM_ML_ITEM | CHANNEL
    SURVEYID | CRM surveys: survey ID | CRM_MKTPL_ATTR | OBJECTIVE
    SURVEYVERSION | CRM surveys: survey version | CRM_MKTPL_ATTR | OBJECTIVE
    TARGETGRP_GUID | CRM marketing: GUID for a target group | CRMD_IM_ML_ITEM | TARGETGRP_GUID
    ORDER_GUID | GUID of a CRM order object | CRMD_IM_ML_ITEM | ORDER_GUID
    OBJECT_TYPE | Business transaction category | Function module: CRM_ORDER_READ | -
    PROCESS_TYPE | Business transaction type | Function module: CRM_ORDER_READ | -
    ELM_ITEM_GUID | CRM marketing: line GUID for a BP list item | CRMD_IM_ML_ITEM | ELM_ITEM_GUID
    CREATED_AT | Transaction was created at this time | CRMD_IM_ML_ITEM | CREATED_AT
    CHANGED_AT | Time of last change to the transaction | CRMD_IM_ML_ITEM | CHANGED_AT
    SUM_ACCESS | Number of times link accessed | CRMD_IM_ML_ITEM | SUM_ACCESS
    SUM_REPLIES | Number of e-mail answers | CRMD_IM_ML_ITEM | SUM_REPLIES

  • 64bit version of firefox, i mean mine field has so many bugs

    Minefield beta 4 (Firefox x64) has many bugs.
    Maybe it's common, because in every update none of them were fixed.
    For example, my bookmarks not only don't work, they are also wiped each time I close and reopen Minefield.
    I couldn't bookmark any link; even though dragging and dropping links into the bookmark pane appears to work, it doesn't in reality.
    Another annoying issue is the feedback part: you can't send any feedback as in Firefox. There's no feedback button at the right corner of the toolbar, and even if you select Feedback from the top menu, it redirects you to this page:
    http://input.mozilla.com/en-US/download
    I think it's better to use Firefox x86 on every x64 system, for many reasons. Minefield x64 also doesn't have any Flash player, because Adobe is still working on an x64 version of the browser Flash player; I think they have been working on it for about 7 years now.
    Another issue is that in the middle of my work my display driver always stops working. I downloaded the latest one, but I found out that it is an issue related to Minefield and nothing else.
    (I have had 64-bit systems for 7 years and have always had problems with software and applications, but I never had any problem with Firefox, at least I think, until these days with Minefield.)
    best regards

    My advice is to stick with the latest release version of Firefox 3.6.12, and leave the 64-bit pre-release builds alone until they are actually released. At this point the 64-builds of Minefield aren't anywhere near ready for use by the average user, and the other software may not be refined enough and might be contributing to your problems.

  • How can i reinstall the search field of my Imac? I mean the field where one enters a web address.

    How can I reinstall the search field of my iMac?

    Choose Show Toolbar or Customize Toolbar from the View menu and put it back.
    (90586)

  • Unable to capture the screen field in badi

    Dear Experts,
    I have my requirement something like this.
    The t code which we are using is cj20n.
    I need to capture the date fields which are in the 2nd tab; when I press Save, these fields need to be captured and validated accordingly.
    I have found the BAdI WORKBREAKDOWN_UPDATE, and when I set a breakpoint this method triggers, but only a few fields are captured, excluding the second-tab values (I mean the date fields).
    So please suggest a solution for what needs to be done in my case.
    Regards,
    Madhavi.

    Try using an import or export parameter ID.

  • Pls help : How To select fields and data from user_table for each tablename

    Please help with a query that generates output selecting code, meaning, in_use for each table in user_tables that has "CODED" as part of the table name.
    user_tables has some 800 tables that contain CODED in the table name.
    Desc of the table:
    DESCPTION:
    Name Null? Type
    SHORT_NAME NOT NULL VARCHAR2(20)
    CODE NOT NULL VARCHAR2(4)
    MEANING NOT NULL VARCHAR2(240)
    IN_USE VARCHAR2(1)
    NOTES VARCHAR2(2000)
    UNITS NOT NULL VARCHAR2(1)
    AMOUNT NOT NULL VARCHAR2(3)
    CONVERTED VARCHAR2(1)
    RUN_NAME VARCHAR2(30)
    But all the table have code, meaning,in_use fields.
    O/P format :
    TABLE_NAME CODE MEANING IN_USE
    Help me pls.

    Not 100% sure what you want. If you want to see all the tables that have all three of those columns, then you could do something like:
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'CODE' and
          table_name like '%CODED%'
    INTERSECT
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'MEANING' and
          table_name like '%CODED%'
    INTERSECT
    SELECT table_name, 'CODE', 'MEANING', 'IN_USE'
    FROM user_tab_columns
    WHERE column_name = 'IN_USE' and
          table_name like '%CODED%'
    If you want to select those three columns from each of the tables, then you could do something like this.
    Create a command file called, for example, makesel.sql that looks like:
    SET PAGES 0 lines 500 trimspool on feedback off;
    spool sel.sql;
    prompt spool selout.txt;
    SELECT 'SELECT '''||table_name||''', code, meaning, in_use FROM '||
           table_name||';'
    FROM (SELECT table_name
          FROM user_tab_columns
          WHERE column_name = 'CODE' and
                table_name like '%CODED%'
          INTERSECT
          SELECT table_name
          FROM user_tab_columns
          WHERE column_name = 'MEANING' and
                table_name like '%CODED%'
          INTERSECT
          SELECT table_name
          FROM user_tab_columns
          WHERE column_name = 'IN_USE' and
                table_name like '%CODED%')
    prompt spool off;
    spool off;
    @sel.sql
    At the sqlplus prompt, run the file using @makesel.sql. This will create another file called sel.sql containing the commands to select those three columns from each table that has all three columns; after the new file is created, it runs that file (@sel.sql). The output will be spooled to a file called selout.txt.
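The same generate-then-run idea can be sketched outside SQL*Plus. Here is a minimal Python version, assuming the qualifying table names have already been fetched from user_tab_columns (the function name and sample table names are illustrative):

```python
# Minimal sketch: generate one SELECT statement per table, mirroring what
# makesel.sql writes into sel.sql. The table list is assumed to come from
# user_tab_columns (tables named like %CODED% having all three columns).
def make_selects(tables):
    return [
        f"SELECT '{t}', code, meaning, in_use FROM {t};"
        for t in sorted(tables)
    ]

for stmt in make_selects(["FOO_CODED", "BAR_CODED"]):
    print(stmt)
```

Writing the generated statements to a file and executing that file is exactly what the spool/prompt trick above automates inside SQL*Plus.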
    HTH
    John

  • I am trying to extract two more fields from 2lis_13_vditm

    Dear Experts,
    I have some tough situation here.
    In BW I have been using the 2lis_13_vditm datasource for the last 5 years. I am running daily deltas to load (10 fields from R/3) an ODS which is in addition mode.
    From this ODS the data goes to another 4 ODSs.
    The client requires adding the ship-to party and sold-to party fields to this existing ODS; that means we now need 12 fields from R/3 to BW.
    On the R/3 side, 2lis_13_vditm already has the 12 fields.
    Come to the point:
    1) If I run a full repair now, the data gets aggregated. Ruled out.
    2) I could run a re-init, but I can't have a block-out period in R/3, so this is also ruled out.
    3) If I run init without data transfer and then delta, it takes more time.
    If you can help me out on this situation it would be great helpful to me
    Rds,
    kk

    Something similar to this should work filling in historical data.
    1. Create new, temporary ODS to contain the key fields and the additional data fields.
    2. Run the logistics setup in R/3 with the new fields added to your data source.
    3. Load data into new ODS.
    4. Create update rules to load from new ODS to existing ODS, updating only key fields and new data fields.
    5. Load from new ODS to existing ODS. At the end of this step all historical info is there.
    6. Continue with normal delta process into existing ODS.
    If you don't want a lengthy setup run, you can consider creating generic data source to pull the data.  You don't even need the temporary ODS. Just load directly into existing ODS to fill in the new data fields.

  • Time field is not appearing in report!

    Hi,
    we have a strange situation where for a particular field I am not seeing any values (all zeroes) in the report! (That particular field is filled via FM logic; I mean there is no such field in R/3.)
    The thing is that this particular field has values in the cube.
    It is a key figure defined as number DEC in the definition.
    what could be the problem?
    Thanks,
    Ravi

    Hi,
    Hope you are very sure that you have seen the values in the cube.
    Then do you have a multiprovider on which you've created the query?
    Then you must re check the identifications.
    If the query is based on the cube, then run transaction RSRT and see the output of the query.
    This way you should be able to know exactly where the prob is.
    If all seems well we'll concentrate on your query design.
    Regards,
    Sharmishtha

  • [Forum FAQ] How to use multiple field terminators in BULK INSERT or BCP command line

    Introduction
    Some people want to know if we can have multiple field terminators in BULK INSERT or BCP commands, and how to implement multiple field terminators in BULK INSERT or BCP commands.
    Solution
    For character data fields, optional terminating characters allow you to mark the end of each field in a data file with a field terminator, and the end of each row with a row terminator. If a terminator character occurs within the data, it is interpreted as a terminator, not as data, and the data after that character belongs to the next field or record. I have done a test; if you use the BULK INSERT or bcp commands with multiple characters as the field terminator, you can refer to the following commands.
    In Windows command line,
    bcp <Databasename.schema.tablename> out "<path>" -c -t -r -T
    For example, you can export data from the Department table with bcp command and use the comma and colon (,:) as one field terminator.
    bcp AdventureWorks.HumanResources.Department out C:\myDepartment.txt -c -t ,: -r \n -T
    However, if you try to specify multiple separate field terminators, as in the following command, bcp will still use only the last terminator defined:
    bcp AdventureWorks.HumanResources.Department in C:\myDepartment.txt -c -t , -r \n -t: -T
    Note that consecutive field terminators imply empty fields. A line in comma-separated format such as
    column1,,column2,,,column3
    may look like it separates only 3 fields (column1, column2 and column3), but in fact there are 6 fields here: every comma terminates a field, and consecutive commas delimit empty fields. That is the significance of a field terminator (a comma in this case).
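The difference between one multi-character terminator and several single-character terminators is easy to demonstrate with a plain string split; a small Python sketch:

```python
line = "column1,,column2,,,column3"

# Splitting on a single comma: every comma is a terminator,
# so consecutive commas produce empty fields -> 6 fields, not 3.
fields_single = line.split(",")
print(len(fields_single))   # 6

# A multi-character terminator such as ",:" is treated as ONE unit,
# so only complete ",:" sequences delimit fields.
line2 = "column1,:column2,:column3"
fields_multi = line2.split(",:")
print(len(fields_multi))    # 3
```

This mirrors the bcp/BULK INSERT behavior described above: ",:" as a single FIELDTERMINATOR yields three fields, while plain commas make each comma count.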
    Meanwhile, using BULK INSERT to import the data of the data file into the SQL table, if you specify terminator for BULK import, you can only set multiple characters as one terminator in the BULK INSERT statement.
    USE <testdatabase>;
    GO
    BULK INSERT <your table> FROM '<path>'
    WITH (
        DATAFILETYPE = 'char | native | widechar | widenative',
        FIELDTERMINATOR = '<field_terminator>'
    );
    For example, using BULK INSERT to import the data of C:\myDepartment.txt data file into the DepartmentTest table, the field terminator (,:) must be declared in the statement.
    In SQL Server Management Studio Query Editor:
    BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',:'
    );
    We cannot declare multiple field terminators (, and :) separately in the statement; with the following format, a duplicate-option error occurs.
    In SQL Server Management Studio Query Editor:
    BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',',
        FIELDTERMINATOR = ':'
    );
    However, if you want to use a data file with fewer or more fields, you can handle it by setting the extra field length to 0 for fewer fields, or by omitting or skipping the extra fields during the bulk copy procedure.
    More Information
    For more information about field terminators, you can review the following articles.
    http://technet.microsoft.com/en-us/library/aa196735(v=sql.80).aspx
    http://social.technet.microsoft.com/Forums/en-US/d2fa4b1e-3bd4-4379-bc30-389202a99ae2/multiple-field-terminators-in-bulk-insert-or-bcp?forum=sqlgetsta
    http://technet.microsoft.com/en-us/library/ms191485.aspx
    http://technet.microsoft.com/en-us/library/aa173858(v=sql.80).aspx
    http://technet.microsoft.com/en-us/library/aa173842(v=sql.80).aspx
    Applies to
    SQL Server 2012
    SQL Server 2008R2
    SQL Server 2005
    SQL Server 2000

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • Changing the length of a key field in a table

    Hi,
    I want to increase the length of a field from 2 to 4 in a standard SAP table and deliver it to the customers. This field is a key field in the table, and it is also used in views and view clusters.
    What is the implication of changing the length for the customers? The customers will already have data in this field and they should not lose any data. Will the existing data remain at length 2, or do they have to do some conversion?
    Regards,
    Srini.
    Edited by: Srinivasa Raghavachar on Feb 7, 2008 12:45 PM

    hi,
    The database table can be adjusted to the changed definition in the ABAP Dictionary in three different
    ways:
    By deleting the database table and creating it again. The table on the database is deleted, the inactive
    table is activated in the ABAP Dictionary, and the table is created again on the database. Data
    existing in the table is lost.
    By changing the database catalog (ALTER TABLE). The definition of the table on the database is
    simply changed. Existing data is retained. However, indexes on the table might have to be built again.
    By converting the table. This is the most time-consuming way to adjust a structure.
    If the table does not contain any data, it is deleted in the database and created again with its new
    structure. If data exists in the table, there is an attempt to adjust the structure with ALTER TABLE. If the
    database system used is not able to do so, the structure is adjusted by converting the table.
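The three adjustment paths above can be summarized in a small decision sketch (illustrative only; in a real system the database utility makes this decision):

```python
def adjust_strategy(table_has_data: bool, alter_table_possible: bool) -> str:
    """Illustrative model of how a changed table definition is adjusted
    on the database, per the three ways described above."""
    if not table_has_data:
        # No data to preserve: drop and recreate with the new structure.
        return "delete and recreate"
    if alter_table_possible:
        # The database can change the catalog in place; data is retained.
        return "ALTER TABLE"
    # Otherwise the time-consuming conversion (QCM copy) is needed.
    return "conversion"

print(adjust_strategy(True, False))
```

Only the last branch triggers the multi-step conversion described next.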
    Example (field lengths before and after the change):
      Before: Field 1 (NUMC 6), Field 2 (CHAR 8), Field 3 (CHAR 60)
      After:  Field 1 (NUMC 6), Field 2 (CHAR 8), Field 3 (CHAR 30)
    The following example shows the steps necessary during conversion.
    Starting situation: Table TAB was changed in the ABAP Dictionary. The length of field 3 was reduced
    from 60 to 30 places.
    The ABAP Dictionary therefore has an active version of the table (field 3 has a length of 60 places) and an inactive version (field 3 now has 30 places).
    The active version of the table was created in the database, which means that field 3 currently has 60
    places in the database. A secondary index with the ID A11, which was also created in the database, is
    defined for the table in the ABAP Dictionary.
    The table already contains data.
    Step 1: The table is locked against further structure changes. If the conversion terminates due to an
    error, the table remains locked. This lock mechanism prevents further structure changes from being
    made before the conversion has been completed correctly. Data could be lost in such a case.
    Step 2: The table in the database is renamed. All the indexes on the table are deleted. The name of the
    new (temporary) table is formed from the prefix QCM and the table name; the temporary table for table TAB is therefore QCMTAB.
    Step 3: The inactive version of the table is activated in the ABAP Dictionary. The table is created on the
    database with its new structure and with the primary index. The structure of the database table is the
    same as the structure in the ABAP Dictionary after this step. The database table, however, does not
    contain any data.
    The system also tries to set a database lock for the table being converted. If the lock is set, application
    programs cannot write to the table during the conversion.
    The conversion is continued, however, even if the database lock cannot be set. In such a case
    application programs can write to the table. Since in such a case not all of the data might have been
    loaded back into the table, the table data might be inconsistent.
    You should therefore always make sure that no applications access the table being converted
    during the conversion process.
    Step 4: The data is loaded back from the temporary table (QCM table) to the new table (with MOVE-CORRESPONDING).
    The data exists in the database table and in the temporary table after this step.
    When you reduce the size of fields, for example, the extra places are truncated when you reload the
    data.
    Since the data exists in both the original table and temporary table during the conversion, the storage
    requirements increase during the process. You should therefore verify that sufficient space is available in
    the corresponding tablespace before converting large tables.
    There is a database commit after 16 MB when you copy the data from the QCM table to the original
    table. A conversion process therefore needs 16 MB resources in the rollback segment. The existing
    database lock is released with the Commit and then requested again before the next data area to be
    converted is edited.
    When you reduce the size of keys, only one record can be reloaded if there are several records whose
    key cannot be distinguished. It is not possible to say which record this will be. In such a case you should
    clean up the data of the table before converting.
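A small, purely illustrative Python model of why key shortening is dangerous: when records are re-keyed on a truncated key, colliding records overwrite each other and only one survives, which is exactly why the table should be cleaned up before the conversion:

```python
# Hypothetical model of reloading data after shortening a key field
# from 6 to 2 characters. Records whose truncated keys collide can keep
# only one row; which row survives is not deterministic in the real
# conversion (here it is simply the last one copied).
rows = {"ABC001": "first", "ABC002": "second", "XYZ001": "third"}

reloaded = {}
for key, value in rows.items():
    reloaded[key[:2]] = value  # truncate the key to its new length

print(sorted(reloaded))  # ['AB', 'XY'] -> one of the ABC* rows was lost
```

Three source rows become two reloaded rows: the two ABC* keys collapse to the same shortened key.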
    Step 5: The secondary indexes defined in the ABAP Dictionary for the table are created again.
    Step 6: The temporary table (QCM table) is deleted.
    Step 7: The lock set at the beginning of the conversion is deleted.
    If the conversion terminates, the table remains locked and a restart log is written.
    Caution: The data of a table is not consistent during conversion. Programs therefore should not access
    the table during conversion. Otherwise a program could for example use incorrect data when reading the
    table since not all the records were copied back from the temporary table. Conversions therefore
    should not run during production! You must at least deactivate all the applications that use tables to
    be converted.
    You must clean up terminated conversions. Programs that access the table might otherwise run
    incorrectly. In this case you must find out why the conversion terminated (for example overflow of the
    corresponding tablespace) and correct it. Then continue the terminated conversion.
    Since the data exists in both the original table and temporary table during conversion, the storage
    requirements increase during conversion. If the tablespace overflows when you reload the data from the
    temporary table, the conversion will terminate. In this case you must extend the tablespace and start the
    conversion in the database utility again.
    If you shorten the key of a table (for example when you remove or shorten the field length of key fields),
    you cannot distinguish between the new keys of existing records of the table. When you reload the data
    from the temporary table, only one of these records can be loaded back into the table. It is not possible
    to say which record this will be. If you want to copy certain records, you have to clean up the table
    before the conversion.
    During a conversion, the data is copied back to the database table from the temporary table with the
    ABAP statement MOVE-CORRESPONDING. Therefore only those type changes that can be executed
    with MOVE-CORRESPONDING are allowed. All other type changes cause the conversion to be
    terminated when the data is loaded back into the original table. In this case you have to recreate the old
    state prior to conversion. Using database tools, you have to delete the table, rename the QCM table to
    its old name, reconstruct the runtime object (in the database utility), set the table structure in the
    Dictionary back to its old state and then activate the table.
    If a conversion terminates, the lock entry for the table set in the first step is retained. The table can no
    longer be edited with the maintenance tools of the ABAP Dictionary (Transaction SE11).
    A terminated conversion can be analyzed with the database utility (Transaction SE14) and then
    resumed. The database utility provides an analysis tool with which you can find the cause of the error
    and the current state of all the tables involved in the conversion.
    You can usually find the precise reason for termination in the object log. If the object log does not
    provide any information about the cause of the error, you have to analyze the syslog or the short dumps.
    If there is a terminated conversion, two options are displayed as pushbuttons in the database utility:
    After correcting the error, you can resume the conversion where it terminated with the Continue
    adjustment option.
    There is also the Unlock table option. This option only deletes the existing lock entry for the table .
    You should never choose Unlock table for a terminated conversion if the data only exists in the
    temporary table, i.e. if the conversion terminated in step 3 or 4.
    Hope this is helpful. Do reward.
