E-Rec integration issue during data transfer

Hi Experts,
In E-Recruiting we are facing an issue where education data is not transferred correctly from E-Rec to PA.
The issue is that in E-Rec we are using table T77RCF_DEGREE, while in PA we are using table T518A, and the challenge lies in keeping the two synchronized.
Please suggest if any workaround is possible.
Regards
Puneet

Hi Puneet,
Usually the 'standard' way to execute the synchronization is through the steps described in the following link:
SAP Library - Talent Management and Talent Development
This scenario should be recommended to your customer if it is using Talent Management in its HCM system and is on the correct release. Otherwise, if the customer still requires the sync to PA infotypes, I see the following workarounds (depending on your scenario, the technology you are using and the moment at which you need to sync the data; if you can provide more detail on these points we can find the best solution):
- Implement a custom RFC with all the logic to read/write/convert the information between the infotypes (see the sketch after this list)
- Use PI to convert/translate the information
- Modify the logic of the synchronization reports mentioned in the link above.
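For the first workaround (the custom RFC), a minimal ABAP sketch of the mapping step could look like the code below. Everything custom in it is hypothetical: the function name and the mapping table ZERC_DEGREE_MAP with its fields EREC_KEY/PA_KEY do not exist in the standard system, and the real key fields of T77RCF_DEGREE and T518A have to be checked in your system first.

FUNCTION z_erec_map_degree_to_pa.
*"--------------------------------------------------------------------
*" Sketch only: maps an e-recruiting degree key (table T77RCF_DEGREE)
*" to a PA education/training key (table T518A) via a custom mapping
*" table ZERC_DEGREE_MAP (hypothetical, maintained e.g. via SM30).
*"  IMPORTING
*"     VALUE(IV_EREC_DEGREE)  TYPE CHAR10
*"  EXPORTING
*"     VALUE(EV_PA_EDUCATION) TYPE CHAR10
*"     VALUE(EV_NOT_MAPPED)   TYPE ABAP_BOOL
*"--------------------------------------------------------------------

  CLEAR: ev_pa_education, ev_not_mapped.

  " Look up the PA key that corresponds to the e-recruiting key.
  SELECT SINGLE pa_key
    FROM zerc_degree_map
    INTO ev_pa_education
    WHERE erec_key = iv_erec_degree.

  IF sy-subrc <> 0.
    " No mapping maintained: flag it so the calling sync logic can
    " log the key instead of writing a wrong value to infotype 0022.
    ev_not_mapped = abap_true.
  ENDIF.

ENDFUNCTION.

Whatever finally writes infotype 0022 would call this lookup for every education record and collect the unmapped keys in an application log, so that missing entries in the mapping table become visible instead of failing silently.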
Hope this helps.
Regards.

Similar Messages

  • Issues with data transfer / connectivity

    Hi,
    I bought a Curve 8320 last week and activated BIS. The issue that I am facing is:
    - Mails do not get delivered and internet connectivity is lost even though the data signal (EDGE) is available. I can see an upload arrow flashing in the top right corner.
    - What I have noticed is that the connectivity gets resumed when I do any of the following:
    a. Switch on WiFi (immediately the data connection starts receiving and sending messages and internet connectivity is resumed). Even after switching off the WiFi, it works fine
    b. Switch off the device and restart
    Can someone help me? Please let me know if you need any more details.
    Thanks

    You can try using FExplorer to delete the file C:\system\shareddata\101ff93b.ini from the phone. This should reset the Data Transfer app.
    You can find FExplorer here:
    http://www.gosymbian.com

  • Spool error during data transfer

    Hi,
      I am trying to print a barcode on a Zebra S600 printer. When I issue the print with a new spool request and 'print immediately', the system generates the spool, but with the error 'Internal error when printing, request on hold'.
    Error class: delayed
    Area: data transfer
    I tried with SPAD as well. How can I solve this error?
    Thanks & Regards,
    Karthik.k

    Hello,
    Can you please tell me whether it is a front-end printer or a remote printer, and please also check that you have installed the correct printer driver.
    If possible, print a test page directly from your machine instead of from SAP.
    Thanks

  • Issues in Data Transfer Process

    Hello All,
    After creating the transformation from the InfoSource to the InfoCube, I am now trying to create the data transfer process.
    In DTP Type it displays "DTP for Direct Access", but I need the DTP type Standard (Can Be Scheduled).
    It gives me the error "Source does not support direct access".
    Could anyone help me solve this error?
    Thanks in advance
    Regards,
    Nithin

    Hi,
    You can only use the type "DTP for Direct Access" when the target of the data transfer process is a VirtualProvider.
    Check the links below:
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/fb8ed8481e1a61e10000000a422035/frameset.htm
    Regards
    KP

  • Issue with Data Transfer Workbench

    I have a simple user-defined object of type Master Data, based on a table with only three fields (Code, Name, U_Remarks).
    When I try to import data into that object using the Data Transfer Workbench, it fails to import any of the records and gives an error:
    Created Failed, Unknown error - 1005.
    Do you have any idea on this? Thanks in advance.

    Hi,
    See link below:
    Unable to import data in Master Data Type in User Table
    I don't think you can import data into a master data type UDO.
    Regards,
    Nat

  • EWM CIF Integration Model master data transfer Issue

    Hi Gurus,
    I am not able to transfer master data from ERP to EWM.
    When I execute the integration model via transactions CFM1 and CFM2, the material is not getting transferred.
    I am getting the error message below:
    "Location does not exist for external location 9997, type 1040, and BSG CSSCMDEV"
    Please help me in resolving this issue.
    Regards
    Govind

    Hi,
    Thanks for the reply
    I had created location 9997 with location type 1001 earlier.
    When I tried to delete that location and create a new location with location type 1040, the system did not allow me to delete it. Please let me know how to delete the existing location.
    Regards
    Govind

  • Authorization Issue while Data Preview from HANA View

    Hi Experts,
    We are using BW on HANA. We have created DSOs (info provider) in BW and generated HANA views from there. We have also created analysis authorizations in BW for authorization relevant characteristics. In HANA, we are able to go to the generated analytic view and preview the data from it successfully.
    Now I have created a test user and assigned a custom role with below authorizations to this user in HANA:
    - bw2hana/../REPORTING role (this role is automatically created by activation of DSO in BW).
    - Roles MODELING, MONITORING, CONTENT_ADMIN, USER.
    - Multiple system privileges although not needed, like REPO.EXPORT, REPO.IMPORT, etc.
    - Analytic Privilege  _SYS_BI_CP_ALL
    - Package Privilege: REPO.READ for all required packages (tried with ROOT package also).
    In BW system also, the test user has analysis authorizations providing access to the relevant info objects.
    But when I try to preview data for the HANA view, I get the attached error (also listed below):
    "Cannot get the data provider outline
    SAP DBTech JDBC: [2048]: Column store error: Search table error: [2950] user is not authorized"
    I tried to trace the situation in HANA and got the below details in 2 trace files:
    indexserver_alert_saphana.trc:
    [6433]{416977}[66/-1] 2014-10-14 00:59:27.541187 e CalcEngine       ceAuthorizationCheck.cpp(02365) : AuthorizationCheckHandler::addAPsToSearchObject: Error during converting SqlAPs to Query entries
    indexserver_saphana.31003.075.trc
    [6433]{416977}[66/-1] 2014-10-14 00:59:27.541197 i TraceContext     TraceContext.cpp(00702) : UserName=TEST_SSO, ApplicationUserName=<<computer name >>, ApplicationName=HDBStudio, ApplicationSource=csns.modeler.datapreview.providers.ResultSetDelegationDataProvider.<init>(ResultSetDelegationDataProvider.java:118);csns.modeler.actions.DataPreviewDelegationAction.getDataProvider(DataPreviewDelegationAction.java:278);csns.modeler.actions.DataPreviewDelegationAction.run(DataPreviewDelegationAction.java:242);csns.modeler.actions.DataPreviewDelegationAction.run(DataPreviewDelegationAction.java:127);csns.modeler.command.handlers.DataPreviewHandler.execute(DataPreviewHandler.java:53);org.eclipse.core.commands
    [6433]{416977}[66/-1] 2014-10-14 00:59:27.541187 e CalcEngine       ceAuthorizationCheck.cpp(02365) : AuthorizationCheckHandler::addAPsToSearchObject: Error during converting SqlAPs to Query entries
    Do you know what this "Error during converting SqlAPs to Query entries" actually means? How can we resolve this issue? The authorization is working properly for our own user IDs, but we need to provide restricted access for business users, so we are trying to create a test user and a custom role.
    Thanks
    Nitesh Gupta

    Hi Pinaki and Prabhith,
    Yes, my issue was resolved. Sorry, I missed updating it here.
    I was just a beginner in BW on HANA security at that time and didn't know many of the small things. The solution was pretty simple.
    Whenever you assign analysis authorizations to a user in BW, you also need to generate the corresponding HANA authorizations. This is done through transaction RS2HANA_CHECK, which converts the BW analysis authorizations into HANA analysis authorizations and assigns them to the HANA user. You can see the generated HANA authorizations in table RS2HANA_AUTH_STR in both BW and HANA.
    Once the HANA authorizations are successfully generated for a user, that user should be able to see data from the views.
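    If you want a quick way to verify that the generation actually produced something, a minimal ABAP sketch like the one below (run in the BW system) simply counts the entries in RS2HANA_AUTH_STR. The field list is deliberately left generic because the table layout can differ between releases; in a real check you would also filter on the user, taking the key fields from the table definition in SE11 rather than guessing them.

      REPORT z_check_rs2hana_auth.

      " Sketch only: check whether transaction RS2HANA_CHECK has generated
      " any entries in the BW-side table RS2HANA_AUTH_STR.
      DATA: lt_auth  TYPE STANDARD TABLE OF rs2hana_auth_str,
            lv_count TYPE i.

      SELECT * FROM rs2hana_auth_str INTO TABLE lt_auth.

      lv_count = lines( lt_auth ).
      IF lv_count = 0.
        WRITE: / 'No generated HANA authorizations found - run RS2HANA_CHECK.'.
      ELSE.
        WRITE: / 'Generated HANA authorization entries:', lv_count.
      ENDIF.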
    Let me know if this solves the issue; then I will close this thread.
    Thanks

  • Issues in data transfer from XI

    Hi all,
    We tried to send data from a 3rd-party system to SAP through XI.
    XI is calling an RFC function module to pass the data into SAP.
    After sending the data, we noticed that the data in SAP looks different from the data sent through XI.
    When the same data was sent again from XI to SAP, this time the data matched.
    No changes had been made to either XI (mapping) or the RFC function module.
    Is there any way we can check where the problem was, so that the same issue doesn't repeat next time?
    Any suggestions/inputs on this issue would be appreciated.
    Thanks & Regards
    Gautam

    Check on the XI side that all the data is arriving and that the mapping rules are correct. Once the XI system gets the proper data, you can check on the SAP side.

  • CRM to ECC: Controlling Integration Issue while saving Service Transactions

    Hi,
    We have configured all CRM service functionalities with ECC integration. We have also set up controlling integration following the best practices document.
    We are getting the following error whenever we try to save any service transaction (service contract or service order):
    "An error occurred in system XXXXECC during account assignment"
    - Errors in prerequisite object Item 2000000042/100 (Notification E IAOM 026)
    - Collector posting not allowed on account (Notification E YAM_RE 013)
    Please help to resolve this issue.
    Thanks,
    Prish

    Hi Ashish,
    Could you please tell me how we assign the debit memo request to the internal order?
    I am creating a debit memo request from the confirmation. The debit memo request does not have the internal order number that was created in ECC, and hence I am not able to post the revenue from the confirmation to the internal order.
    Could you please help.
    Regards,
    Itisha

  • Character integrity issue after data conversion in database/JDBC

    Hi
    I am using oracle 9i with the following NLS setting:
    NLS_LANGUAGE :AMERICANS
    NLS_CHARACTERSET : UTF8
    NLS_NCHAR_CHARACTERSET :AL16UTF16
    I am running on Linux with this as my environment Language:
    Lang: en_US.UTF8
    I am sending Hindi characters in an XML file (UTF-8 encoding) to my Java application to be stored in the database. In my XML file I use these character references (ignore the double quotes; I put them in so that the browser does not interpret the entities):
    "&#x928";"&#x92E";"&#x938";"&#x94D";"&#x924";"&#x947"
    But the characters appeared unreadable in the database. When I use SELECT DUMP to check the character encoding, I get:
    Typ=1 Len=12 CharacterSet=UTF8: 0,28,0,2e,0,38,0,4d,0,24,0,47
    When I retrieve data from the database via my application, garbled characters appear.
    However, if I manually input the Hindi characters into the column of the table, the Hindi characters appear correctly. When I do a DUMP to check, this is what I get:
    Typ=1 Len=12 CharacterSet=UTF8: 9,28,9,2e,9,38,9,4d,9,24,9,47
    When I check the Unicode chart here http://www.unicode.org/charts/PDF/U0900.pdf, the second DUMP result is correct. When I retrieve the data from the database via my application, the correct Hindi string appears.
    I understand that in Java the encoding is UTF-16 and that the Oracle JDBC driver converts from UTF-16 to UTF-8 before storing into my database, and vice versa. The thing that puzzles me is why the correct Hindi string appears on my web interface when the same conversion is used to extract the data from the database. At first I suspected a conversion problem in JDBC, with the UTF-16 characters getting truncated to UTF-8 when storing the data to the database. But when good data is stored in the database, the extraction is correct, albeit going through the same conversion.
    I have read several threads on this forum and also the Oracle Globalization Support article, but I cannot find an answer to my question.
    Can anyone help? Thanks.

    A couple of checkpoints for you:
    1. When you load the XML from SXMB_MONI in the test tab of the message mapping and it turns red, this means the constructed XML (from the content conversion) does not match the one (XSD) defined in your ESR/IR. In this case you have to thoroughly check the file content conversion field values/field lengths in the sender communication channel again.
    2. Once you rectify the error above, you can test the mapping in the ESR message mapping.

  • Error in the source system while loading data through a process chain

    Hi,
    I am facing an issue with data loading for certain extractors through a process chain. The loads of 0BPM_WIHEAD, 0BP_LOGHIST, 0BPM_OBJREL and 0BPM_DEADLINES (there are 2-3 more extractors) fail daily with the message 'Error occurred in the source system', and when we repeat them they execute successfully.
    Regards,
    Javed

    Hi,
    It means that the extraction job is failing in the source system.
    Please check the job log in the source system: copy the request number, go to SM37 in the source system, enter * and then the request number, execute, and check the job log.
    There may be multiple reasons behind this:
    1) Connection problem: go to SM59 and test the connection. If you don't have access to SM59, go to RSA1 --> Source Systems --> right-click --> Connection Parameters --> Connection Test. (A small programmatic check is sketched after this list.)
    2) Or work processes may not be available, due to which the jobs fail with a timeout.
    In both cases you can check with the Basis team whether they can suggest something, or you can change the process chain scheduling time if possible and schedule it at a time when fewer jobs are running.
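    If you want to do the connection check programmatically (for example as a small ABAP step before the load), a rough sketch like the one below could be used. The destination name is only a placeholder and has to be replaced with the real RFC destination of your source system:

      " Sketch only: ping the source-system RFC destination before the load.
      DATA: lv_dest TYPE rfcdest VALUE 'SOURCE_SYSTEM_DEST',
            lv_msg  TYPE c LENGTH 255.

      CALL FUNCTION 'RFC_PING'
        DESTINATION lv_dest
        EXCEPTIONS
          communication_failure = 1 MESSAGE lv_msg
          system_failure        = 2 MESSAGE lv_msg
          OTHERS                = 3.

      IF sy-subrc <> 0.
        WRITE: / 'Connection to', lv_dest, 'failed:', lv_msg.
      ELSE.
        WRITE: / 'Connection to', lv_dest, 'is OK.'.
      ENDIF.

    If even the ping fails, it points to a connection or authorization problem on the RFC destination rather than a resource problem in the source system.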
    Regards,
    Debjani..

  • E Rec data transfer & integration with PA

    Hi Gurus,
    As per my understanding, E-Rec & PA integration is a different concept, and data transfer from E-Rec to PA and vice versa is separate.
    We have E-Rec on the same instance as ECC, and the client needs all E-Rec object data to remain available permanently and most of the data to be transferred from E-Rec to PA.
    Can you please elaborate on the integration and data transfer concepts?
    Regards

    Thanks Nicole for your inputs.
    I need to confirm some more of my understanding:
    - Where is the master data of E-Rec stored, and is there any storage period to be defined for the data?
    I understand that attachments are stored in DMS and that we can define a storage period for them, but what about the object data, activity record data, etc.?
    The external candidate is attached to a CP and to a BP, but where are these stored, in ECC or elsewhere?
    - Regarding the integration aspect, I was told that the PSA will be mapped to the location in E-Rec and that the Ed fields in PA will be available in E-Rec. Is that correct? I have to take the requirements for each object from the client (e.g. functional area list, industry, etc.), so do I need to consider any PA integration data that will already be available in E-Rec, where I just need to add the list?
    - When a job posting is viewed by an external candidate, does it hit ECC for the details?
    - How should a virus scan for attached documents be handled?
    Regards

  • Rec - data transfer

    Hi
    While transferring data from Recruitment to PA, I want only those infotypes which were updated by the user in Recruitment to be transferred to PA (apart from the mandatory infotypes). I am using operation MOD, but the infotypes updated in Recruitment still do not pop up during the data transfer.
    Please help me in this regard
    Regards
    Minal


  • Error while executing Initialize with Data Transfer for 0FI_GL_10

    Hello All,
    After the pre-production refresh, the timestamp for the datasource 0FI_GL_10 got reset, due to which our deltas did not bring any records into the BW system.
    First we did an 'Initialize without data transfer' for all the BW-related datasources. The deltas were then set properly for all datasources except 0FI_GL_10.
    We then raised a message to SAP, and they suggested running the 'Initialize with data transfer' for 0FI_GL_10 so that the timestamp is set and the deltas are fixed accordingly.
    The issue now is that we are getting the following error message while running the init with data transfer:
    Job terminated in source system --> Request set to red
    Message no. RSM078
    We have copied the data up to the first week of July 2011.
    Please advise. The issue is very critical.
    Thanks & Regards
    Sneha

    Hi Arvind
    Thanks for your inputs.
    Please find below the details of the short dump.
    Runtime Errors         DBIF_RSQL_SQL_ERROR
    Exception              CX_SY_OPEN_SQL_DB
    Date and Time          09/07/2011 11:25:32
    Short text
         SQL error in the database when accessing a table.
    What can you do?
         Note which actions and input led to the error.
         For further help in handling the problem, contact your SAP administrator
         You can use the ABAP dump analysis transaction ST22 to view and manage
         termination messages, in particular for long term reference.
    How to correct the error
         Database error text........: "ORA-01652: unable to extend temp segment by 128
          in tablespace PSAPTEMP"
         Internal call code.........: "[RSQL/FTCH/FAGLFLEXT ]"
         Please check the entries in the system log (Transaction SM21).
         If the error occures in a non-modified SAP program, you may be able to
         find an interim solution in an SAP Note.
         If you have access to SAP Notes, carry out a search with the following
         keywords:
         "DBIF_RSQL_SQL_ERROR" "CX_SY_OPEN_SQL_DB"
    Information on where terminated
        Termination occurred in the ABAP program "GP_GLX_FAGLFLEXT" - in
         "FETCH_TO_ISTRUCTURE".
        The main program was "SBIE0001 ".
        In the source code you have the termination point in line 903
        of the (Include) program "GP_GLX_FAGLFLEXT".
        The program "GP_GLX_FAGLFLEXT" was started as a background job.
        Job Name....... "BIREQU_4N3PZQ12IA0X0PYGEA85IG39S"
        Job Initiator.. "BIWREMOTE"
        Job Number..... 11203300
        The termination is caused because exception "CX_SY_OPEN_SQL_DB" occurred in
        procedure "FETCH_TO_ISTRUCTURE" "(FORM)", but it was neither handled locally
         nor declared
        in the RAISING clause of its signature.
        The procedure is in program "GP_GLX_FAGLFLEXT "; its source code begins in line
        840 of the (Include program "GP_GLX_FAGLFLEXT ".

  • SRM MDM: Workflow not launching during automatic data transfer

    Hi,
    We are trying to import some data from R/3 4.6C by configuring the remote system as ERP and creating a port based on the XML schema in the SRM-MDM Catalog. We have created a workflow to validate the accuracy of the pulled data in the Data Manager. Everything goes well here; I am able to get the data transferred into the Data Manager automatically. The only issue is with the workflow.
    The workflow set in the Data Manager does not launch automatically, even though the record coming in via the automatic transfer is accurate. I have correctly set the workflow name in the Import Manager while saving the map file for automatic import.
    Please share your inputs on why the workflow does not launch during automatic data transfer from the specified port.
    Thanks/Pawan

    Hi Pawan,
    please check in the Data Manager that you have selected the right trigger actions for the workflow, such as Record Import, and that Autolaunch is also set correctly.
    Regards,
    Tamá
