Problem in HRMD_A Idocs: Object is Locked

Hi Everyone,
I would like to ask for assistance on the problem below.
We encounter failed HRMD_A IDocs almost every day. Almost all of them have the same error message: "Object is Locked. A locked key has been set for the object."
The object type is "P".
Can you help us solve this problem?
Thanks in advance!

Hi,
For a locked object the sender system itself will not send the IDoc, so the lock error may have been raised in the sender system rather than the receiver system.
Yes, you can process the IDocs that are in error in the receiver system by regularly scheduling reports such as
RBDMANI2 - reprocess IDocs with errors, and
RBDAPP01 - post IDocs that are ready for transfer to the application.
I would also suggest the BAPI way of transferring data, where the receiving system reads data from the sender system once a day and reads only the data that changed on that date; a rough JCo sketch of such a pull is shown below.
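For illustration only, here is a minimal sketch of what such a BAPI read could look like from a Java/JCo client (the same idea applies SAP-to-SAP via RFC). The destination name "ABAP_AS_WITH_POOL" is an assumed, locally configured destination, and BAPI_EMPLOYEE_GETDATA with the parameters shown is only illustrative - verify the BAPI interface in SE37/BAPI Explorer before relying on it:

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoTable;

public class HrBapiPullExample
{
    public static void main(String[] args) throws JCoException
    {
        // "ABAP_AS_WITH_POOL" is an assumed destination; it must be defined via a
        // jcoDestination file or a registered DestinationDataProvider.
        JCoDestination dest = JCoDestinationManager.getDestination("ABAP_AS_WITH_POOL");

        // BAPI_EMPLOYEE_GETDATA is a standard HR BAPI; the parameter names used here
        // are illustrative and should be checked in SE37 first.
        JCoFunction fn = dest.getRepository().getFunction("BAPI_EMPLOYEE_GETDATA");
        if (fn == null)
            throw new RuntimeException("BAPI_EMPLOYEE_GETDATA not found in repository");

        fn.getImportParameterList().setValue("EMPLOYEE_ID", "00001234"); // hypothetical PERNR
        fn.execute(dest);

        JCoTable personalData = fn.getTableParameterList().getTable("PERSONAL_DATA");
        System.out.println("Personal data rows read: " + personalData.getNumRows());
    }
}

A pull like this avoids the IDoc locking issue on the sender side entirely, at the cost of once-a-day latency instead of near real-time distribution.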
Regards,
Divya

Similar Messages

  • Problem regarding HRMD_A IDoc as outbound posting

    Hi,
    I have a requirement to post the data for hiring (PA40 - hiring action) as an IDoc, for which I am using HRMD_A.
    The client's specific requirement is to post the data as soon as the user completes the hiring of the employee through PA40.
    I tried to add the logic on save of the infotype, but how can we confirm which infotype is the last screen, since not all infotypes are compulsory and the user may skip screens?
    How can the IDoc creation be triggered?
    Can anyone help find the triggering point for this, or suggest an alternative?
    Basically, the client does not want to schedule a job that posts at a fixed, agreed time, nor to use change pointers.
    Is there any workaround for this?
    Any suggestions would be of great help.

    Activate change pointers for HRMD_A, turn on immediate processing in the partner profile, and schedule a job every n minutes to create IDocs from change pointers. And please choose the correct forum for this, because it has nothing to do with ABAP.

  • Service template problem - Unable to perform the job because one or more of the selected objects are locked by another job - ID 2606

    Hello,
    I’ve finally managed to deploy my first guest cluster with a shared VHDX using a service template. 
    So, I now want to try and update my service template.  However, whenever I try to do anything with it, in the services section, I receive the error:
    Unable to perform the job because one or more of the selected objects are locked by another job.  To find out which job is locking the object, in the jobs view, group by status, and find the running or cancelling job for the object.  ID 2606
    Well I tried that and there doesn’t seem to be a job locking the object.  Both the cluster nodes appear to be up and running, and I can’t see a problem with it at all.  I tried running the following query in SQL:
    SELECT * FROM [VirtualManagerDB].[dbo].[tbl_VMM_Lock] where TaskID='Task_GUID'
    but all this gives me is an error that says: "Msg 8169, Level 16, State 2, Line 1 - Conversion failed when converting from a character string to uniqueidentifier."
    I'm no SQL expert as you can probably tell, but I'd prefer not to deploy another service template in case this issue occurs again.
    Can anyone help?

    No one else had this?

  • Problem in receving IDOCS

    I have a problem receiving an IDoc. I checked transaction SM58 and it says "Error when opening RFC connection".
    So I reprocessed the IDoc via transaction WE19 and then BD87, where SAP says it has sent the IDoc and the status is set to 03.
    However, when I checked SM58 again, this time it says "RfcAbort: Cannot lock transaction".
    Any ideas or pointers as to what the problem is or might be?
    Below is the RFC server code I am using; it is the sample code that comes along with sapjco.jar.
    Thanks a lot
    Thanks a lot
    package milestone.ups;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.util.Hashtable;
    import java.util.Map;
    import java.util.Properties;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;
    import com.sap.conn.jco.ext.DestinationDataProvider;
    import com.sap.conn.jco.ext.ServerDataProvider;
    import com.sap.conn.jco.server.DefaultServerHandlerFactory;
    import com.sap.conn.jco.server.JCoServer;
    import com.sap.conn.jco.server.JCoServerContext;
    import com.sap.conn.jco.server.JCoServerContextInfo;
    import com.sap.conn.jco.server.JCoServerErrorListener;
    import com.sap.conn.jco.server.JCoServerExceptionListener;
    import com.sap.conn.jco.server.JCoServerFactory;
    import com.sap.conn.jco.server.JCoServerFunctionHandler;
    import com.sap.conn.jco.server.JCoServerState;
    import com.sap.conn.jco.server.JCoServerStateChangedListener;
    import com.sap.conn.jco.server.JCoServerTIDHandler;
    public class StepByStepServer
        static String SERVER_NAME1 = "SERVER";
        static String DESTINATION_NAME1 = "ABAP_AS_WITHOUT_POOL";
        static String DESTINATION_NAME2 = "ABAP_AS_WITH_POOL";
        static MyTIDHandler myTIDHandler = null;
        static
            Properties connectProperties = new Properties();
            connectProperties.setProperty(DestinationDataProvider.JCO_ASHOST, " PRIVATE_IP ");
            connectProperties.setProperty(DestinationDataProvider.JCO_SYSNR, "01");
            connectProperties.setProperty(DestinationDataProvider.JCO_CLIENT, "001");
            connectProperties.setProperty(DestinationDataProvider.JCO_USER, " usr");
            connectProperties.setProperty(DestinationDataProvider.JCO_PASSWD, "pwd ");
            connectProperties.setProperty(DestinationDataProvider.JCO_LANG, "en");
            createDataFile(DESTINATION_NAME1, "jcoDestination", connectProperties);
            connectProperties.setProperty(DestinationDataProvider.JCO_POOL_CAPACITY, "3");
            connectProperties.setProperty(DestinationDataProvider.JCO_PEAK_LIMIT, "10");
            createDataFile(DESTINATION_NAME2, "jcoDestination", connectProperties);
            Properties servertProperties = new Properties();
            servertProperties.setProperty(ServerDataProvider.JCO_GWHOST, " PRIVATE_IP ");
            servertProperties.setProperty(ServerDataProvider.JCO_GWSERV, "sapgw01");
            servertProperties.setProperty(ServerDataProvider.JCO_PROGID, "JCO_SERVER");
            servertProperties.setProperty(ServerDataProvider.JCO_REP_DEST, "ABAP_AS_WITH_POOL");
            servertProperties.setProperty(ServerDataProvider.JCO_CONNECTION_COUNT, "2");
            createDataFile(SERVER_NAME1, "jcoServer", servertProperties);
        static void createDataFile(String name, String suffix, Properties properties)
            File cfg = new File(name + "." + suffix);
            if(!cfg.exists())
                try
                    FileOutputStream fos = new FileOutputStream(cfg, false);
                    properties.store(fos, "for tests only !");
                    fos.close();
                catch(Exception e)
                    throw new RuntimeException("Unable to create the destination file " + cfg.getName(), e);
        static class StfcConnectionHandler implements JCoServerFunctionHandler
            public void handleRequest(JCoServerContext serverCtx, JCoFunction function)
                System.out.println("----------------------------------------------------------------");
                System.out.println("call              : " + function.getName());
                System.out.println("ConnectionId      : " + serverCtx.getConnectionID());
                System.out.println("SessionId         : " + serverCtx.getSessionID());
                System.out.println("TID               : " + serverCtx.getTID());
                System.out.println("repository name   : " + serverCtx.getRepository().getName());
                System.out.println("is in transaction : " + serverCtx.isInTransaction());
                System.out.println("is stateful       : " + serverCtx.isStatefulSession());
                System.out.println("----------------------------------------------------------------");
                System.out.println("gwhost: " + serverCtx.getServer().getGatewayHost());
                System.out.println("gwserv: " + serverCtx.getServer().getGatewayService());
                System.out.println("progid: " + serverCtx.getServer().getProgramID());
                System.out.println("----------------------------------------------------------------");
                System.out.println("attributes  : ");
                System.out.println(serverCtx.getConnectionAttributes().toString());
                System.out.println("----------------------------------------------------------------");
                System.out.println("CPIC conversation ID: " + serverCtx.getConnectionAttributes().getCPICConversationID());
                System.out.println("----------------------------------------------------------------");
                System.out.println("req text: " + function.getImportParameterList().getString("REQUTEXT"));
                function.getExportParameterList().setValue("ECHOTEXT", function.getImportParameterList().getString("REQUTEXT"));
                function.getExportParameterList().setValue("RESPTEXT", "Hello World");
                // In sample 3 (tRFC Server) we also set the status to executed:
                if(myTIDHandler != null)
                    myTIDHandler.execute(serverCtx);
        static class MyThrowableListener implements JCoServerErrorListener, JCoServerExceptionListener
            public void serverErrorOccurred(JCoServer jcoServer, String connectionId, JCoServerContextInfo serverCtx, Error error)
                System.out.println(">>> Error occured on " + jcoServer.getProgramID() + " connection " + connectionId);
                error.printStackTrace();
            public void serverExceptionOccurred(JCoServer jcoServer, String connectionId, JCoServerContextInfo serverCtx, Exception error)
                System.out.println(">>> Error occured on " + jcoServer.getProgramID() + " connection " + connectionId);
                error.printStackTrace();
        static class MyStateChangedListener implements JCoServerStateChangedListener
            public void serverStateChangeOccurred(JCoServer server, JCoServerState oldState, JCoServerState newState)
                // Defined states are: STARTED, DEAD, ALIVE, STOPPED;
                // see JCoServerState class for details.
                // Details for connections managed by a server instance
                // are available via JCoServerMonitor
                System.out.println("Server state changed from " + oldState.toString() + " to " + newState.toString() + " on server with program id "
                        + server.getProgramID());
        static void step2SimpleServer()
            JCoServer server;
            try
                server = JCoServerFactory.getServer(SERVER_NAME1);
            catch(JCoException ex)
                throw new RuntimeException("Unable to create the server " + SERVER_NAME1 + ", because of " + ex.getMessage(), ex);
            JCoServerFunctionHandler stfcConnectionHandler = new StfcConnectionHandler();
            DefaultServerHandlerFactory.FunctionHandlerFactory factory = new DefaultServerHandlerFactory.FunctionHandlerFactory();
            factory.registerHandler("STFC_CONNECTION", stfcConnectionHandler);
            server.setCallHandlerFactory(factory);
            // additionally to step 1
            MyThrowableListener eListener = new MyThrowableListener();
            server.addServerErrorListener(eListener);
            server.addServerExceptionListener(eListener);
            MyStateChangedListener slistener = new MyStateChangedListener();
            server.addServerStateChangedListener(slistener);
            server.start();
            System.out.println("The program can be stopped using <ctrl>+<c>");
        static class MyTIDHandler implements JCoServerTIDHandler
            Map<String, TIDState> availableTIDs = new Hashtable<String, TIDState>();
            public boolean checkTID(JCoServerContext serverCtx, String tid)
                // This example uses a Hashtable to store status information. But usually
                // you would use a database. If the DB is down, throw a RuntimeException at
                // this point. JCo will then abort the tRFC and the R/3 backend will try
                // again later.
                System.out.println("TID Handler: checkTID for " + tid);
                TIDState state = availableTIDs.get(tid);
                if(state == null)
                    availableTIDs.put(tid, TIDState.CREATED);
                    return true;
                if(state == TIDState.CREATED || state == TIDState.ROLLED_BACK)
                    return true;
                return false;
                // "true" means that JCo will now execute the transaction, "false" means
                // that we have already executed this transaction previously, so JCo will
                // skip the handleRequest() step and will immediately return an OK code to R/3.
            public void commit(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: commit for " + tid);
                // react on commit e.g. commit on the database
                // if necessary throw a RuntimeException, if the commit was not
                // possible
                availableTIDs.put(tid, TIDState.COMMITTED);
            public void rollback(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: rollback for " + tid);
                availableTIDs.put(tid, TIDState.ROLLED_BACK);
                // react on rollback e.g. rollback on the database
            public void confirmTID(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: confirmTID for " + tid);
                try
                    // clean up the resources
                // catch(Throwable t) {} //partner wont react on an exception at
                // this point
                finally
                    availableTIDs.remove(tid);
            public void execute(JCoServerContext serverCtx)
                String tid = serverCtx.getTID();
                if(tid != null)
                    System.out.println("TID Handler: execute for " + tid);
                    availableTIDs.put(tid, TIDState.EXECUTED);
            private enum TIDState
                CREATED, EXECUTED, COMMITTED, ROLLED_BACK, CONFIRMED;
        public static void main(String[] a)
            // step1SimpleServer();
            step2SimpleServer();
            // step3SimpleTRfcServer();

    Thanks Sameer.
    I am sending IDocs when a delivery is created, i.e. through transaction VL01N/VL02N.
    I don't know why, but for some reason I don't have the mw package in my sapjco.jar, which I got from the SAP Service Marketplace.
    I modified the code to handle the IDoc request as you suggested. I put System.out calls in the code to check whether the handler is being invoked, but nothing was printed when I ran the code.
    I tried to reprocess the IDocs through WE19, but they failed (as shown in SM58).
    Here is the modified code (a minimal wiring sketch follows after the listing)....
    package milestone.ups;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStreamWriter;
    import java.util.Hashtable;
    import java.util.Map;
    import java.util.Properties;
    import com.sap.conn.idoc.IDocDocumentList;
    import com.sap.conn.idoc.IDocXMLProcessor;
    import com.sap.conn.idoc.jco.JCoIDoc;
    import com.sap.conn.idoc.jco.JCoIDocHandler;
    import com.sap.conn.idoc.jco.JCoIDocHandlerFactory;
    import com.sap.conn.idoc.jco.JCoIDocServerContext;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.ext.DestinationDataProvider;
    import com.sap.conn.jco.ext.ServerDataProvider;
    import com.sap.conn.jco.server.JCoServer;
    import com.sap.conn.jco.server.JCoServerContext;
    import com.sap.conn.jco.server.JCoServerContextInfo;
    import com.sap.conn.jco.server.JCoServerErrorListener;
    import com.sap.conn.jco.server.JCoServerExceptionListener;
    import com.sap.conn.jco.server.JCoServerState;
    import com.sap.conn.jco.server.JCoServerStateChangedListener;
    import com.sap.conn.jco.server.JCoServerTIDHandler;
    import com.sap.conn.idoc.jco.*;
    public class StepByStepServer
        static String SERVER_NAME1 = "SERVER";
        static String DESTINATION_NAME1 = "ABAP_AS_WITHOUT_POOL";
        static String DESTINATION_NAME2 = "ABAP_AS_WITH_POOL";
        static MyTIDHandler myTIDHandler = null;
        static
            Properties connectProperties = new Properties();
            connectProperties.setProperty(DestinationDataProvider.JCO_ASHOST, "172.31.64.74");
            connectProperties.setProperty(DestinationDataProvider.JCO_SYSNR, "01");
            connectProperties.setProperty(DestinationDataProvider.JCO_CLIENT, "001");
            connectProperties.setProperty(DestinationDataProvider.JCO_USER, "US9904");
            connectProperties.setProperty(DestinationDataProvider.JCO_PASSWD, "us9904");
            connectProperties.setProperty(DestinationDataProvider.JCO_LANG, "en");
            createDataFile(DESTINATION_NAME1, "jcoDestination", connectProperties);
            connectProperties.setProperty(DestinationDataProvider.JCO_POOL_CAPACITY, "3");
            connectProperties.setProperty(DestinationDataProvider.JCO_PEAK_LIMIT, "10");
            createDataFile(DESTINATION_NAME2, "jcoDestination", connectProperties);
            Properties servertProperties = new Properties();
            servertProperties.setProperty(ServerDataProvider.JCO_GWHOST, "172.31.64.74");
            servertProperties.setProperty(ServerDataProvider.JCO_GWSERV, "sapgw01");
            servertProperties.setProperty(ServerDataProvider.JCO_PROGID, "JCO_SERVER");
            servertProperties.setProperty(ServerDataProvider.JCO_REP_DEST, "ABAP_AS_WITH_POOL");
            servertProperties.setProperty(ServerDataProvider.JCO_CONNECTION_COUNT, "2");
            createDataFile(SERVER_NAME1, "jcoServer", servertProperties);
        static void createDataFile(String name, String suffix, Properties properties)
            File cfg = new File(name + "." + suffix);
            if(!cfg.exists())
                try
                    FileOutputStream fos = new FileOutputStream(cfg, false);
                    properties.store(fos, "for tests only !");
                    fos.close();
                catch(Exception e)
                    throw new RuntimeException("Unable to create the destination file " + cfg.getName(), e);
        static class MyIDocHandler implements JCoIDocHandler
            public void handleRequest(JCoServerContext serverCtx, IDocDocumentList idocList)
                System.out.println("IN Handler"); // <-- this did not print on the console
                FileOutputStream fos = null;
                OutputStreamWriter osw=null;
                  try
                       IDocXMLProcessor xmlProcessor =
                            JCoIDoc.getIDocFactory().getIDocXMLProcessor();
                    fos=new FileOutputStream(serverCtx.getTID()+"_idoc.xml");
                    osw=new OutputStreamWriter(fos, "UTF8");
                       xmlProcessor.render(idocList, osw,
                                 IDocXMLProcessor.RENDER_WITH_TABS_AND_CRLF);               
                       osw.flush();
                  catch (Throwable thr)
                       thr.printStackTrace();
                finally
                    try
                        if (osw!=null)
                            osw.close();
                        if (fos!=null)
                            fos.close();
                    catch (IOException e)
                        e.printStackTrace();
        static class MyIDocHandlerFactory implements JCoIDocHandlerFactory
             private JCoIDocHandler handler = new MyIDocHandler();
             public JCoIDocHandler getIDocHandler(JCoIDocServerContext serverCtx)
                  System.out.println("Handler Object created and invoked");
                  return handler;
        static class MyThrowableListener implements JCoServerErrorListener, JCoServerExceptionListener
            public void serverErrorOccurred(JCoServer jcoServer, String connectionId, JCoServerContextInfo serverCtx, Error error)
                System.out.println(">>> Error occured on " + jcoServer.getProgramID() + " connection " + connectionId);
                error.printStackTrace();
            public void serverExceptionOccurred(JCoServer jcoServer, String connectionId, JCoServerContextInfo serverCtx, Exception error)
                System.out.println(">>> Error occured on " + jcoServer.getProgramID() + " connection " + connectionId);
                error.printStackTrace();
        static class MyStateChangedListener implements JCoServerStateChangedListener
            public void serverStateChangeOccurred(JCoServer server, JCoServerState oldState, JCoServerState newState)
                // Defined states are: STARTED, DEAD, ALIVE, STOPPED;
                // see JCoServerState class for details.
                // Details for connections managed by a server instance
                // are available via JCoServerMonitor
                System.out.println("Server state changed from " + oldState.toString() + " to " + newState.toString() + " on server with program id "
                        + server.getProgramID());
        static void step2SimpleServer()
             JCoIDocServer server;
            try
                server = JCoIDoc.getServer(SERVER_NAME1);
            catch(JCoException ex)
                throw new RuntimeException("Unable to create the server " + SERVER_NAME1 + ", because of " + ex.getMessage(), ex);
            //JCoServerFunctionHandler stfcConnectionHandler = new StfcConnectionHandler();
            server.setIDocHandlerFactory(new MyIDocHandlerFactory());
            //factory.registerHandler("STFC_CONNECTION", stfcConnectionHandler);
            // additionally to step 1
            MyThrowableListener eListener = new MyThrowableListener();
            server.addServerErrorListener(eListener);
            server.addServerExceptionListener(eListener);
            MyStateChangedListener slistener = new MyStateChangedListener();
            server.addServerStateChangedListener(slistener);
            server.start();
            System.out.println("Server Started");
            System.out.println("The program can be stopped using <ctrl>+<c>");
        static class MyTIDHandler implements JCoServerTIDHandler
            Map<String, TIDState> availableTIDs = new Hashtable<String, TIDState>();
            public boolean checkTID(JCoServerContext serverCtx, String tid)
                // This example uses a Hashtable to store status information. But usually
                // you would use a database. If the DB is down, throw a RuntimeException at
                // this point. JCo will then abort the tRFC and the R/3 backend will try
                // again later.
                System.out.println("TID Handler: checkTID for " + tid);
                TIDState state = availableTIDs.get(tid);
                if(state == null)
                    availableTIDs.put(tid, TIDState.CREATED);
                    return true;
                if(state == TIDState.CREATED || state == TIDState.ROLLED_BACK)
                    return true;
                return false;
                // "true" means that JCo will now execute the transaction, "false" means
                // that we have already executed this transaction previously, so JCo will
                // skip the handleRequest() step and will immediately return an OK code to R/3.
            public void commit(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: commit for " + tid);
                // react on commit e.g. commit on the database
                // if necessary throw a RuntimeException, if the commit was not
                // possible
                availableTIDs.put(tid, TIDState.COMMITTED);
            public void rollback(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: rollback for " + tid);
                availableTIDs.put(tid, TIDState.ROLLED_BACK);
                // react on rollback e.g. rollback on the database
            public void confirmTID(JCoServerContext serverCtx, String tid)
                System.out.println("TID Handler: confirmTID for " + tid);
                try
                    // clean up the resources
                // catch(Throwable t) {} //partner wont react on an exception at
                // this point
                finally
                    availableTIDs.remove(tid);
            public void execute(JCoServerContext serverCtx)
                String tid = serverCtx.getTID();
                if(tid != null)
                    System.out.println("TID Handler: execute for " + tid);
                    availableTIDs.put(tid, TIDState.EXECUTED);
            private enum TIDState
                CREATED, EXECUTED, COMMITTED, ROLLED_BACK, CONFIRMED;
        public static void main(String[] a)
            // step1SimpleServer();
            step2SimpleServer();
            // step3SimpleTRfcServer();
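    For reference, here is a minimal wiring sketch of step2SimpleServer() along the lines of the standard SAP IDoc server example - only a sketch, not a confirmed fix for the handler not being invoked. One difference from the listing above: myTIDHandler is declared but never registered, and since IDocs arrive via tRFC it is worth setting a TID handler as well; the error and state listeners are kept so that gateway or registration problems show up on the console.
        static void step2SimpleServer()
        {
            JCoIDocServer server;
            try
            {
                server = JCoIDoc.getServer(SERVER_NAME1);
            }
            catch (JCoException ex)
            {
                throw new RuntimeException("Unable to create the server " + SERVER_NAME1, ex);
            }
            // register the IDoc handler factory as in the listing above
            server.setIDocHandlerFactory(new MyIDocHandlerFactory());
            // IDocs are delivered via tRFC, so also register the TID handler
            myTIDHandler = new MyTIDHandler();
            server.setTIDHandler(myTIDHandler);
            // listeners make gateway/registration errors visible on the console
            MyThrowableListener eListener = new MyThrowableListener();
            server.addServerErrorListener(eListener);
            server.addServerExceptionListener(eListener);
            server.addServerStateChangedListener(new MyStateChangedListener());
            server.start();
            System.out.println("Server started with program id " + server.getProgramID()
                    + " on gateway " + server.getGatewayHost() + "/" + server.getGatewayService());
        }
    If the handler is still never invoked, it may also be worth checking that the RFC destination in SM59 (type T, registered program JCO_SERVER) and the tRFC port in WE21 actually point at this registered program ID; otherwise SM58 will keep reporting connection errors instead of calling the server.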

  • ALE HRMD_A Idocs in status 52

    HRMD_A IDocs in the target system are not completely posted and end up in status 52. Most of the time the infotype 1001 IDoc with the relationship P->S arrives before the IDoc that contains the actual objects (P, S), causing "Relationship impossible because the object does not exist".
    We use program RBDAPP01 for inbound processing of the IDocs, which should take care of serialization based on the timestamp of IDoc creation in the source system.
    Change pointers for message type HRMD_A are processed using program RBDMIDOC in the source system.
    It is unlikely that the relationship P->S is created without object P being created in the source system, so the change pointers should create the IDocs in the same sequence, but that is not happening. Has anybody experienced a similar problem?
    Any help on this appreciated.

    The question here is whether all necessary data has already been transferred to the target system. If data is selected based on change pointers, then only data that has changed in the source system is sent to the target.
    Example: imagine you change an employee's position using one of the positions already created in the system. The change pointer will then capture only the new assignment between the employee and the position; it will not transfer the existing position itself. In this case you must be sure that all existing positions have already been transferred to the target system.
    In your example only the new relationship P->S is transferred to the target. The employee data should already be there. The change here is not about creating a new employee but only about the new relationship P->S.
    Best Regards,
    Andrzej

  • Problem with INVOIC01 idoc.

    Hi,
    We are receiving IDoc INVOIC01 (vendor invoice in MM). My problem is that we have to identify the segment and field where the barcode number can be populated in this inbound IDoc, and also how to pass this barcode number to MIRO/MIRA.
    Can anyone throw some light on this?
    -B S B

    Is barcode the same as UPC / EAN number?
    Take a look at the segment E1EDP19. You could use either EAN (same as UPC in the US, as far as I understand) or re-use any other ID mentioned there that is not otherwise used in your system.
    E1EDP19 : IDoc: Document Item Object Identification
    Structure
    Use
    The segment includes the material description.
    Different material descriptions for a material can be transferred via the qualifier.
    QUALF : IDOC object identification such as material no.,customer
    Values
    '001' Material number used by customer
    '002' Material number used by vendor
    '003' International Article Number(EAN)
    '004' Manufacturer part number
    '005' Interchanged manufacturer part number
    '006' Pricing reference material
    '007' Commodity code
    '010' Batch number
    '011' Country of origin of material
    '012' Shipping unit
    '013' Original material number (ALE)
    '014' Serial number
    '015' Manufacturing plant
    '016' Revision level
    '017' Additionals

  • Problem with HRMD_A06 idoc

    Dear All,
    I am using the IDoc HRMD_A06 to carry out various operations on the available HR infotypes. Please go through the details below for further insight into the problem.
    1. Scenario:
    a. An interface to the SAP HR system needs to be built
    b. The interface will supply incremental master data in
       the form of flat files
    c. An integration broker, SAP XI, will then convert the
       flat files to IDocs and send the IDocs to the SAP HR system
    d. The IDoc chosen for this purpose is HRMD_A.HRMD_A06
    e. However, this IDoc does not support different
       operations on incremental data such as modify, delete, etc.
    f. Further validations are also not supported
    g. Also, segments for certain infotype records are missing
    Example for the Delete operation mode:
    Operation Delete: Delete an existing infotype record
    Expected result: Only the particular infotype record, as supplied in the IDoc, should be deleted, and not the entire personnel number with all its infotypes
    Actual result: The entire personnel number is deleted.
    Is it that the IDoc behaves this way, or is there some other flag apart from OPERATION which needs to be set?
    Regards
    Vinay

    Hi Vinay,
    Have you found a solution ?
    Because I have a similar problem using IDoc HRMD_A06: when I delete infotype 0021 for an employee, there is no E1P0021 segment, only its E1PITYP segment.
    Regards
    Mickael

  • The requested object is locked by another transaction

    Hi All,
    I have a problem while loading data from one ODS to another ODS: it shows 450 records, but the request is red and it says
    'The requested object is locked by another transaction'
    Message no. MC602
    Diagnosis
    A lock requested by calling an ENQUEUE function module cannot be provided because the object in question has already been locked by its own transaction.
    Technical Information: The C_ENQUEUE routine returns the following values:
    COLLISION_OBJECT = ERSMONFACT
    COLLISION_UNAME = SINGH
    System response
    The ENQUEUE function module triggers an appropriate exception. If this exception was not intercepted by the application program, it leads automatically to the active SAP transaction being cancelled.
    Procedure
    Once the ENQUEUE function module is called, the application program should intercept this exception and react to it appropriately.
    I tried SM12 but there is no lock entry. Can you suggest something?
    Regards,

    Hello,
    Here your DSO seems to be locked because some read or write operation is happening at the time you are running the DTP.
    Two things to look at:
    1. Check the error you are getting while running the DTP in the process monitor and try to find the request ID which has locked the DSO. Copy that request ID, go to transaction RSRQ and see what that request is (who is the user, what is the target, etc.). Contact the user about cancelling the job if it has been stuck for a very long time. Copy the job ID for that request from RSRQ, go to SM37 with the date shown in RSRQ, and kill the job.
    2. Go to SM12, try to find the object there and remove the lock.
    Thanks,
    Deepak

  • HRMD_A IDOCs - Restrict PERNRs

    Hello,
    I would like to restrict a certain range of personnel numbers from being included in an IDOC.
    The current setup:
    1. HRMD_A distribution is configured and live in production with all PA and OM infotypes applicable
    2. The BAdI HRALE00OUTBOUND_IDOC is enabled for certain customizations in the IDoc
    The required functionality:
    The personnel number range 80000000 to 89999999 should be restricted from the IDOCs. This includes:
    1. No PA infotypes for a PERNR in this number range should be sent out
    2. No Relationships to/from a PERNR in this number range should be sent out
    Current solution:
    The current solution that I have deployed in development uses the BAdI: I identify the segments (E1PLOGI, E1PITYP and E1xxxx) which have object type P and a PERNR in that number range, and once caught the segment is deleted from the internal table.
    Looking for Solution:
    As this does not fall into the best-practices category, I was wondering if
    1. there is an option to prevent change pointers from being created for the PERNRs in question, or
    2. there is an option to prevent the IDoc generation program from picking up change pointers for the PERNRs in question.
    Please let me know if there is a development option, if not a configuration one.
    Thanks,
    Prashanth

    Found the solution: Below is an example of how to do this. I just defined one filter and modified the BADI.
    The method FILTER_VALUES_SET determines the values FILTER1 and FILTER2, which you can use in the segment E1PLOGI for data filtering.
    Example:
    The following example explains the steps you need to take to enable filtering according to an employee's last name and company code.
    The example assumes that neither the last name nor the company code of an employee changes over time. A complete, time-dependent solution is always customer-specific and cannot be offered in the standard.
    1. Define Filters as ALE Object Types
    Using the Define ALE Object Types function (transaction BD95 (table TBD11)) you create the ALE object type ZFILTER1 for the table name T77ALE_FILTER1T and field name FILTER1.
    In the same way you create the ALE object type ZFILTER2 for table name T001T and field name BUKRS.
    ALE Object Type     Table Name     Field Name
    ZFILTER1     T77ALE_FILTER1T     FILTER1
    ZFILTER2     T001T     BUKRS
    2. Assign Filters to a Message Type
    Using the Assign Object Type to Message Type function (transaction BD59 (table TBD10)), you assign the following values to the ALE object types ZFILTER1 and ZFILTER2 for the message type HRMD_A:
    ALE Object Type     Segment Type     No.     Field
    ZFILTER1     E1PLOGI     1     FILTER1
    ZFILTER2     E1PLOGI     1     FILTER2
    The ALE object types entered here can be used as a filter in the distribution model (transaction BD64). As input help for ZFILTER1 you get the value table T77ALE_FILTER1T, and for ZFILTER2 value table T001T.
    3. Determine the Values FILTER1 and FILTER2
    So that ALE outbound processing can evaluate the fields, the fields E1PLOGI-FILTER1 and E1PLOGI-FILTER2 in the IDoc must be filled with values.
    These values are determined in the current interface, which is called up in outbound processing and in which the fields FILTER1 and FILTER2 are filled with customer-specific values.
    4. Distribute Data According to the Filter Settings
    In this example, table T77ALE_FILTER1 contains two entries:
    A-L (for people with last names between A and L)
    M-Z (for people with last names between M and Z).
    The BAdI fills the FILTER1 field according to the infotype field P0002-NACHN with the value A-L if the last name is between A and L, or with the value M-Z if the last name is between M and Z.
    The BAdI fills the FILTER2 field with the company code from the infotype field P0001-BUKRS.
    In the distribution model the following filters are set up:
    To system A: Filter 1 = A-L, Filter 2 is empty
    To system B: Filter 1 = M-Z, Filter 2 = 0001, 1001
    People with last names between A and L, of whichever company code, are distributed to system A. People with last names between M and Z, who are assigned to company code 0001 or 1001, are distributed to system B. Note that in this example people with last names between M and Z who are not assigned to company code 0001 or 1001 are not distributed.

  • HRMD_A idocs sent and processed, but Org chart doesn't reflect it sometimes

    From our HR box we send HRMD_A IDocs to our FI box. We have an intermittent issue where we have a position change and the IDoc gets created and sent over.
    On the inbound side, the IDoc gets posted successfully with status 53. However, this employee does not appear in the org chart as it does on the HR side.
    We have these IDocs set up to process immediately, so they pass to the FI port and are processed within a few seconds.
    We come across this issue about once every 3-4 weeks and I can't seem to get anywhere trying to resolve it.
    any ideas?

    There are two main annoying problems with Flash Builder and swc's:
    1.  Flash Builder doesn't reflect changes made to swc files
    Solution : First of all - keep your swc files inside your project folder, you can also disable global swc cache: change as3api.cpp code,refresh swc in flashBuilder bug!
    2. Flash Builder breaks content inside swc files (and also the popular problem "TypeError: Error #1034: Type Coercion failed: cannot convert flash.display::MovieClip@1f21adc1 to...")
    Solution: locate the project's .actionScriptProperties file and set useFlashSDK="false": Flash Builder 4.7 – useFlashSDK | In Flagrante Delicto!
    In this case you can also download latest Apache Flex SDK to target new Flash player versions: http://flex.apache.org/
    Hope this helps!
    P.

  • Problems at sending IDOC from XI to R/3

    Hello everybody, I have a problem sending an IDoc (WPUUMS) from XI to R/3. Every time the ERP system has overflow problems in the lock table, the IDoc sent by XI is not created. The sending message type is asynchronous. I tried to use the ALE configuration but it doesn't seem to work.
    I hope anyone can help me to figure out this issue.
    Thank you.

    Julio
    Please try this...
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f6d2d790-0201-0010-9382-b50b499b3fbe?quicklink=index&overridelayout=true
    http://help.sap.com/saphelp_nw70/helpdata/en/44/a1b46c4c686341e10000000a114a6b/content.htm
    Check the queue [SMQ2], and if the message is stuck in the queues, clear the queues.
    Then go to SMQR, deregister and re-register the queues.
    I hope it helps you ...
    Thanks,
    Vasanth.

  • ** SMQ2 - Object is locked by PIAFUSER - Message Has Errors

    Hi Friends,
    In SXMB_MONI, for a particular message, in the Queue column we have red time-bomb symbols and it shows 'Message Has Error'. When we double-click the queue, we can see the error message 'Object is locked by PIAFUSER'. How can we solve this problem and restart the message?
    Kindly help friends.
    Kind Regards,
    Jeg P.

    Hi
    1) Clear the queues in SMQ2: first try to activate them; if that doesn't work, delete them.
    2) You may need to re-register your queues.
    Follow SXMB_ADM -> Manage Queues -> Deregister Queues.
    Afterwards, re-register the queues and activate them from the same screen.
    3) Only use SMQ1 (outbound queues) and SMQ2 (inbound queues) if re-registering doesn't fix it. You will need to give the queues some time to get activated, so be patient.
    4) From SXMB_MONI go to the Queue ID of the message; clicking it takes you to the qRFC monitor.
    If messages are stuck, you should see an entry indicating the number of messages stuck in the queue.
    Double-clicking the entry takes you to the detailed list.
    If the messages are stuck because of an error (other than the queue not being registered, e.g. a message failure), the first message will show you that.
    If you do find a message stuck in the queue, the way out is to delete the stuck message (DO NOT do this on a production server without being sure of what it means for that particular process!) and unlock the queue from the previous screen.
    5) Go to SMQ2 and execute F8. It should then report that nothing was selected.
    If any queues are present, open one; if the status is SYSFAIL, check the entries, right-click on the status text and choose Save LUW. This brings the queue to READY.
    Go to -> QIN Scheduler: check whether the scheduler status is inactive, then Edit -> Activate Scheduler: the status goes from inactive to starting to waiting. Now the queue will be RUNNING and you can see the entries in the queue moving.
    Now go to SMQ3, right-click on the queue and choose Restore LUW.
    6) This basically means that the message is waiting in the queue.
    In SXMB_MONI go to the Queue ID column and double-click on the entry. It will take you to SMQ2; check the status of the queue.
    If it is SYSFAIL or STOP, double-click on it and try to correct the error. After correcting the error, send the message again. Also check whether the queue is registered in transaction SMQR.
    7) Select your queue from SXMB_MONI; you will reach the qRFC monitor.
    Here you will see some lock entries; choose 'Unlock Queues' and activate your queues once again.
    This applies if your data is correct and the queue is simply stuck. You should only delete the queue if it has errored out due to incorrect data and you urgently need to process the rest of the data in the queue.
    If you drill down one step further, select the error message and on the menu choose Edit -> Save LUW.
    thanq
    krishna

  • How to handle the Idoc which is locked

    Hi Experts,
    As per my requirement, I am triggering an inbound error IDoc when an update fails for transaction VA42, and I have also developed the inbound function module for reprocessing the error IDoc that has been triggered through my report program.
    The problem I am getting now is:
    - The user wants to see the error IDoc number, so I am displaying the IDoc number in the report output (i.e. in the list) as well.
    - If I try to reprocess that error IDoc number using BD87, it does not call the inbound function module. While debugging I found that it raises the exception 'IDoc && is currently locked'.
    But when I come out of the report output, BD87 does call the inbound FM.
    Please give some suggestions so that I can reprocess the error IDoc (status 51) without leaving the report output.
    Thanks in advance.
    - Srinivas.

    Hi,
    If the data is locked, the IDoc will be partially posted (status 52) or not posted (status 51),
    so run the program RBDMANI2 for statuses 51 and 52,
    or
    go to transaction SWO1 -> IDOCSTATUS -> enter the IDoc number and execute:
    first run the method 'Processing of IDoc Containing Errors',
    then run the method 'IDOC.StatusProcess',
    and change the status from 51 or 52 to 53.
    See SAP Note 898626.
    http://help.sap.com/saphelp_nw04/helpdata/en/75/4b3c1cd14811d289810000e8216438/frameset.htm
    Regards,
    Prabhudas

  • Object moving locked in marketing plan

    Hello,
    We have created a marketing plan with a specific nomenclature. Then we released it so that the budget can be transferred to BW and then to R/3.
    But it shows the status 'Object moving locked', whereas it should have the status approved and released.
    Will there be any problem in the future if the marketing plan has the status 'Object moving locked'?
    Regards,
    Divya

    Hi Cristina,
    Check Customer Relationship Management -> Marketing -> Marketing Planning and Campaign Management -> System Landscape -> Define ERP Integration Type.
    I hope that helps.
    Best regards.

  • BAPI_POSRVAPS_SAVEMULTI3 error Object is Locked at present

    Hi,
    I'm using the BAPI BAPI_POSRVAPS_SAVEMULTI3.
    If I pass a larger number of records (purchase requisitions) for requirements and receipts, it does not update any values and throws the error: 'Object is Locked at present'.
    So I segregated the orders by receiving location and used the BAPI BAPI_POSRVAPS_SAVEMULTI3 per location.
    It updates some locations, but then throws the same error again.
    Is there any constraint? I don't have any clue about it.
    If I execute it for a single location it works fine.
    Please advise.
    Thanks in advance....
    Regards,
    Dayakar

    Hi Manish,
    Actually, these locks are not because of any other user logging in. They happen because of the following:
    For example, I have locations loc1, loc2, loc3, loc4, and I am looping over the locations and executing the BAPI. It executes fine for some locations, for example loc1 and loc2, and gives an error (no further execution) for the others. When I checked with the Basis people, it showed the error 'Lock table overflow' in the liveCache.
    So I went through the problem description, which says lock table overflow, and asked them to change the lock table size. They increased it and now it is working fine. The problem is solved.
    Thank you for your response.
    Regards,
    dayakar
    Edited by: Dayakar Chirivella on Sep 19, 2008 9:46 AM
