Data Transfer Between HR and Erec

Hi all,
We have two backend systems: one for HR (ECC) and another for E-Recruiting (eRec).
We have set up the ALE data transfer using message type HRMD_ABA.
In eRec only the ERECRUIT component is deployed in the backend; no EA-HR or SAP HR component is deployed.
Meaning there are no PA* tables.
Now when we transfer a personnel number (with its A008 relationship to a position) to the eRec box, it says the pernr does not exist.
However, I cannot transfer the pernr from ECC to eRec, as we don't have PA* tables in eRec.
Can you share the object types that are transferred from ECC to eRec and from eRec to ECC?
What components need to be deployed in the backend eRec box?
In the Portal, which backend (system object) should be used for the requisitions application, and which backend system (system object) for the recruiter and recruitment admin applications?
Thanks,
Nachy

Hi,
1) First get your users created with roles in your eRec system, for example Recruiter, Manager, Recruitment Admin.
2) Get your org structure created in your HR backend system.
3) Make sure your basic infotypes are maintained, with infotype 0105 (system user name / e-mail ID) maintained.
Assign the user in 0105 subtype 0001.
4) With the ALE you have set up, first move the org structure,
  then move the person (pers. no.).
Choose insert mode for the ALE, and go to WE10 to check that your IDocs are clean, without errors.
Make sure all the personal data, including address and postal code, is maintained.
Let me know if it helps.
Regards
Santosh
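The ordering Santosh describes (org structure first, persons afterwards, because person records reference positions that must already exist in the target) can be sketched in pseudocode. This is a hypothetical illustration, not SAP code; the object-type letters follow the usual OM convention (O = org unit, S = position, C = job, P = person):

```python
# Hypothetical sketch of the distribution order described above:
# org objects must arrive in the target system before the person (P)
# records that reference them, otherwise the target reports "no pernr".
ORG_TYPES = {"O", "S", "C"}

def order_for_distribution(objects):
    """Return objects with the org structure first and persons (P) last."""
    org = [o for o in objects if o["otype"] in ORG_TYPES]
    persons = [o for o in objects if o["otype"] == "P"]
    return org + persons

batch = [
    {"otype": "P", "objid": "00001234"},   # person
    {"otype": "O", "objid": "50000100"},   # org unit
    {"otype": "S", "objid": "50000200"},   # position
]
ordered = order_for_distribution(batch)
print([o["otype"] for o in ordered])   # -> ['O', 'S', 'P']
```

The same idea applies whether you send everything in one IDoc packet or in two separate transfers: the position must exist before the holder relationship for it can be resolved.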

Similar Messages

  • Secure the file/data transfer between XI and any third-party system

Hi All,
I would like to secure, via SSH at the OS level, the file/data transfer between XI and any third-party system, using "Run OS Command before processing" and "Run OS Command after processing". Right now my XI server is installed on iSeries OS.
With iSeries we can't call Unix commands, so I expect we need to go for AS/400 (CL) programming. If we create the AS/400 program, how can I call it from XI?
If anyone has an idea, please let me know whether it will work or not.
Thanks in advance.
    Venkat

    Hi,
    Thanks for your reply.
I have read some blogs, like /people/krishna.moorthyp/blog/2007/07/31/sftp-vs-ftps-in-sap-pi, about calling a Unix shell script in XI.
But as far as I know, on iSeries OS we cannot write shell scripts, so we need to go for an AS/400 program. If we go with AS/400, how do we call that program, and will it work? I am not sure, and I need some help please.
    Thanks,
    Venkat

  • Most efficient data transfer between RT and FPGA

This post is related to THIS post about DMA overhead.
I am currently investigating the most efficient way to transfer a set of variables to an FPGA target for our application.  We have been using DMA FIFOs for communications in both directions (to and from the FPGA), but I have recently been questioning whether this is the most efficient approach.
Our application must communicate several parameters (around 120 different variables in total) to the FPGA.  Approximately 16 of these are critical, meaning that they must be sent every iteration of our RT control loop.  The others are also important but can be sent at a slightly slower rate without jeopardising the integrity of our system.  Until now we have sent these 16 critical parameters plus ONE non-critical parameter over a DMA to the FPGA card.  Each 32-bit value sent incorporates an ID which allows the FPGA to demultiplex to the appropriate global variables on the FPGA.  Thus over time (we run a 20 kHz control loop on the RT system) we have a complete set of parameters sent at approx. 200 Hz.  The DMA transfers are currently a relatively large factor in limiting the execution speed of our RT loop: of the 50 us available per time slot running at 20 kHz, approximately 12-20 us are the DMA transfers to and from the FPGA target.  Our FPGA loop is running at 8 MHz.
According to NI, the most efficient way to transfer data to an FPGA target is via DMA.  While this may in general be true, I have found that for SMALL amounts of data, DMA is not terribly efficient in terms of speed.  Below is a screenshot of a benchmark program I have been using to test the efficiency of different types of transfer to the FPGA.  In the test I create a 32 MB data set (except for the FXP values, which are only present for comparison and have no pertinence to this issue at the moment) which is sent to the FPGA over DMA in differing sized blocks (with the number of DMA writes times the array size being constant).  We thus move from a single really large DMA transfer to a multitude of extremely small transfers and monitor the time taken for each mode and data type.  The FPGA sends a response to the DMA transfers so that we can be sure, when reading the response DMA, that ALL of the data has actually arrived on the FPGA target and is not simply buffered by the system.
We see that the minimum round-trip time for the DMA write and subsequent DMA read for confirmation is approximately 30 us.  When sending less than 800 bytes, this time is essentially constant per packet.  Only when we start sending more than 800 bytes at a time do we see an increase in the time taken per packet.  A packet of 1 byte and a packet of 800 bytes take approximately the SAME time to transfer.  Our application is sending 64 bytes of critical information to the FPGA target each time, meaning that we are clearly in the "less efficient" region of DMA transfers.
If we compare the times taken when communicating over FP controls, we see that irrespective of how many controls we write at a time, the overall throughput is constant, with a timing of 2.7 us for 80 bytes.  For a small dedicated set of parameters, the use of front-panel controls seems to be significantly faster than sending per DMA.  Once we need to send more than 800 bytes, the DMA starts to become rapidly more efficient.
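The timing figures above can be put into a crude cost model to see where the crossover lies. This is only a back-of-envelope sketch using the numbers reported in the post (~30 us fixed cost per DMA round trip up to ~800 bytes, ~2.7 us per 80 bytes for FP controls); the linear scaling beyond 800 bytes is an assumption, not a measurement:

```python
# Back-of-envelope model of the benchmark timings quoted above.
DMA_FIXED_US = 30.0            # round-trip cost for any packet <= 800 bytes
DMA_THRESHOLD_BYTES = 800
FP_US_PER_80_BYTES = 2.7       # constant-throughput FP-control cost

def dma_time_us(n_bytes):
    """Crude DMA cost model: flat up to the threshold, linear beyond it."""
    if n_bytes <= DMA_THRESHOLD_BYTES:
        return DMA_FIXED_US
    return DMA_FIXED_US * (n_bytes / DMA_THRESHOLD_BYTES)

def fp_time_us(n_bytes):
    """FP-control cost model: time scales linearly with bytes written."""
    return FP_US_PER_80_BYTES * (n_bytes / 80.0)

# The application sends 64 bytes of critical data per RT iteration:
print(dma_time_us(64))   # 30 us over DMA, regardless of the small size
print(fp_time_us(64))    # ~2.2 us via FP controls
```

Under this model the 64-byte critical set is roughly an order of magnitude cheaper over FP controls, consistent with the benchmark's "less efficient region" observation for small DMA packets.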
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

    So to continue:
For small data sets the use of FP controls may be faster than DMAs.  OK.  But we're always told that each and every FP control takes up resources, so how much more expensive is the version with FP controls over the DMA?
    According to the resource usage guide for the card I'm using (HERE) the following is true:
DMA (1023 elements, I32, no arbitration): 604 flip-flops, 733 LUTs, 1 block RAM
1x I32 FP control: 52 flip-flops, 32 LUTs, 0 block RAM
So the comparison would seem to yield the following result (for 16 elements):
DMA : 604 flip-flops, 733 LUTs, 1 block RAM
FP : 832 flip-flops, 512 LUTs, 0 block RAM
We require more flip-flops, fewer LUTs, and no block RAM.  It's a swings-and-roundabouts scenario.  Depending on which resources are actually limited on the target, one version or the other may be preferred.
However, upon thinking further I realised something else.  When we use the DMA, it is purely a communications channel.  Upon arrival, we unpack the values and store them into global variables in order to make the values available within the FPGA program.  We also multiplex other values in the DMA, so we can't simply arrange the code to be fed directly from the DMA, which would negate the need for the globals at all.  The FP controls, however, ARE already persistent data storage, and assuming we pass the values along a wire into subVIs, we don't need additional globals in this scenario.  So the burning question is "How expensive are globals?".  The PDF linked to above does not explicitly mention the difference in cost between FP controls and globals, so I'll have to assume they're similar.  This of course massively changes the conclusion arrived at earlier.
    The comparison now becomes:
DMA + globals : 1436 flip-flops, 1245 LUTs, 1 block RAM
FP : 832 flip-flops, 512 LUTs, 0 block RAM
This seems very surprising to me.  I'm suspicious of my own conclusion here.  Can someone with more knowledge of the resource requirement differences between globals and FP controls weigh in?  If this is really the case, we need to re-think our approach to communications between RT and FPGA, most likely to employ a hybrid approach.
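The resource arithmetic above is easy to check. The sketch below reproduces it using the per-element figures quoted from the resource-usage guide; the one assumption (made by the post itself) is that a global costs about the same as an FP control:

```python
# Reproducing the resource comparison above.  DMA and FP figures are the
# ones quoted from the card's resource-usage guide; globals are ASSUMED
# to cost the same as FP controls, as the post assumes.
N = 16                                       # number of I32 parameters
DMA = {"ff": 604, "lut": 733, "bram": 1}     # DMA FIFO, 1023 elements, I32
FP  = {"ff": 52,  "lut": 32,  "bram": 0}     # one I32 front-panel control

fp_total = {k: N * v for k, v in FP.items()}
dma_plus_globals = {k: DMA[k] + N * FP[k] for k in DMA}  # globals ~ FP cost

print(fp_total)           # {'ff': 832, 'lut': 512, 'bram': 0}
print(dma_plus_globals)   # {'ff': 1436, 'lut': 1245, 'bram': 1}
```

So the totals in the post check out: once the globals needed to hold the demultiplexed DMA values are counted, the FP-control version is cheaper on every resource except that the DMA version's block RAM usage is fixed at one.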
    Shane.

  • Data Transfer between BW and BO

I have two servers in different networks, with a firewall between the two networks. One can get through to the Internet, and the other can only get through to the enterprise intranet. BusinessObjects Enterprise XI 3.1 SP3 is set up on the server which can get through to the Internet, and Business Warehouse is set up on the other server, which can only get through to the enterprise intranet. My question is how to deploy BusinessObjects so that we can transfer the data in BW to BO.

Hi Roland,
When I create a new connection with SAP Business Warehouse, there is an error:
DBD: can't connect to the SAP BW Server CMALLC : rc=27 > Connect from SAP gateway to RFC server failed Connect_PM GWHOST=192.168.1.66, GWSERV=sapgw00, SYSNR=00
    Location SAP-Gateway on host saptest/sapgw00
    Error   timeout during allocate
    Time   Tue Feb 22 15:25:58 2011
    Release  720
    Component SAP-Gateway
    Version  2
    RC      242
    Module  gwr3cpic.c
    Line     1911
    Detail    no connect of TP sapdp00 from host 192.168.1.66 after 20 sec
    Counter  2
Is it related to the BOE solution kit? I also get an error when importing SAP roles in the CMC.
    Exception in JSP: /jsp/auth/sapsec_import_role.jsp:22 19: 20: <% 21: String context=secSAPR3ImportRoleBean.getContextPath(); 22: secSAPR3ImportRoleBean.init(request); 23: response.setHeader("Expires", "0"); 24: %> 25: Stacktrace:
    Thank you very much!

  • Data transfer between SAP and SRM

Hi all,
I'm new to SRM and would like some help.
We work with SAP 4.7 and now with SRM 5.0, and I need to make the two systems communicate.
I need to send the master data from SAP to SRM: Material, Customer, Material Group, Cost Center.
I need to send the shopping cart from SRM to create a PR in SAP.
I need to send the RFQ, PO, and Contract documents from SAP to SRM. For the RFQ, SAP needs to receive the price data updated by the customer.
How do I make this connection? Are there standard interfaces for these data? What are these interfaces?
    Thanks,
    Luciano Lessa

Hello Ashutosh,
Thanks for your help.
I have one question about replication of the material master. The step-by-step guide says to fill the CRMSUBTAB table with the following values:
User  ObjectName  U/D       Obj. Class  Function  Obj. Type  Funct. Name
CRM   empty       Download  Material    empty     empty      CRS_MATERIAL_EXTRACT
CRM   empty       Download  Material    empty     empty      CRS_CUSTOMIZING_EXTRACT
CRM   empty       Download  Material    empty     empty      CRS_SERVICE_EXTRACT
In our system it is:
User  ObjectName  U/D       Obj. Class      Function  Obj. Type  Funct. Name
CRM   empty       Download  Material        empty     empty      CRS_MATERIAL_EXTRACT
CRM   empty       Download  CUSTOMIZING     empty     empty      CRS_CUSTOMIZING_EXTRACT
CRM   empty       Download  SERVICE_MASTER  empty     empty      CRS_SERVICE_EXTRACT
Is this configuration correct?
The field OBJ. CLASS is part of the table key, so it is not possible to add three identical values.
    Thanks,
    Luciano Lessa

  • Data transfer between SAP & Java and Vice versa using IDOC Process

Dear Experts,
We are working on a good requirement related to data transfer between SAP and Java software. The client wants to transfer data both ways (from SAP to Java and vice versa).
In detail: after sales order creation, loading plan details are calculated using a custom program. Once the loading dates are confirmed, the user will release the sales document to transfer the data from SAP to Java using outbound IDoc processing. Similarly, some shipment details will be handled in that Java software; once completed, the details need to be pumped back from Java to SAP as inbound IDoc processing.
The fields have already been identified for the external software, and we are looking to perform the corresponding steps in SAP.
At this stage, I need your expert opinion / feedback on how to proceed.
Meaning:
                 1. What customizing steps need to be done in SAP?
                 2. How do we trigger the outbound IDoc process once the documents are "released" from the custom transaction?
                 3. How do we create the link between SAP and Java to transfer the data between these two systems?
                 4. How do we trigger the inbound IDoc process from the Java software to SAP, and how do we store the data in SAP?
Experts, please give your feedback in a reply or by sending the step-by-step process to fulfil this client requirement.
    Thanks for your cooperation.
    Regards,
    Ramesh

Maybe there are too many open questions in the same document.
Maybe you should repost a more specific question in a technical forum.
This looks like a small project where you already know what you want; maybe you should contract a technical specialist so he can proceed with the implementation!

I have a problem with data transfer between Windows Server 2012RT and Windows 7 (no more than 14 kbps), while between Windows Server 2012RT and Windows 8.1 the speed is OK.

I have a problem with data transfer between Windows Server 2012RT and Windows 7 (no more than 14 kbps), while between Windows Server 2012RT and Windows 8.1 the speed is OK.

    Hi,
    Regarding the issue here, please take a look at the below links to see if they could help:
    Slow data transfer speed in Windows 7 or in Windows Server 2008 R2
    And a blog here:
    Windows Server 2012 slow network/SMB/CIFS problem
    Hope this may help
    Best regards
    Michael

  • Error 33172 occurred at Read & Write data transfer between two or more PF2010 controller

Hi, I need to do data transfer between two or more FP2010 controllers, e.g. FP2010(A) and FP2010(B).
FP2010(A) needs to transfer the measurements (from its I/O module) to FP2010(B) to do the data analysis. These data transfers should be synchronous between the two controllers to prevent data loss.
From the VI used in the attachment, I encountered some problems:
(1) Error 33172 occurred while publishing the data. Can I create and publish data under a different item name?
(2) How do I synchronize the read & write between controllers?
All controllers are communicating with each other directly, without the need for a host computer to link them together.
Is there any other method to do fast data transfer between controllers?

Hi YongNei,
You were successful in omitting enough information to make it very difficult to answer!
Please post your example.
Please tell us what version of LV-RT you are using.
Please define what you consider "fast data transfer".
Have you considered mapping the FP tags of FP2010(A) to FP2010(B) and vice versa?
What exactly has to be synchronized?
If you have something that is close to working, share that.
Well, that is as far as I can go with the info you have provided. Depending on the details, what you are asking could be anything from trivial to impossible with the currently available technology. I just can't say.
It would probably be a good idea to start over with a fresh question (sorry), because not many people are going to know what a "PF2010" is, and I cannot guarantee that I will be able to get back to you personally until next weekend.
Trying to help you get an answer,
Ben
Ben Rayner
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Data streaming between server and client does not complete

Using an ad-hoc app, data streaming between the server and client does not complete as it is supposed to.
The process runs well on Solaris 5.8; however, under 5.9 we have found the character stream buffer length limit is around 900 to 950 characters (by default we are using 3072 characters).
Example:
- We are transferring an HTML file, which will be displayed in the app client. With buffer=3072, the HTML is only displayed/transferred as xxxxxxxx characters, but with buffer=900 the HTML is displayed properly. In this case, the only problem we have is that the file transfer eventually takes longer than usual.
- There is another case where we have to transfer information (data) as a stream to the client. A long data stream will not appear at all in the client.
Any ideas why the change between 5.8 and 5.9 would cause problems?
The current app driver that we are using is compiled using Solaris 5.6. If possible we would like to use a later version, compiled using Solaris 5.9; do you think this will probably solve our problem?
    Thanks
    Paul
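The workaround Paul describes (dropping the buffer from 3072 to 900 characters) amounts to chunking the stream below the size that still works. A minimal, hypothetical sketch of that idea, with the writer abstracted as a callable so it applies to any stream:

```python
# Hypothetical sketch of the buffer-size workaround described above:
# split a long character stream into chunks no larger than the buffer
# size that still works (900, per the post), instead of one 3072-char write.
import io

SAFE_BUFFER = 900   # empirically working size under Solaris 5.9, per the post

def send_in_chunks(data, write, chunk_size=SAFE_BUFFER):
    """Write `data` through `write` in chunks of at most chunk_size."""
    for start in range(0, len(data), chunk_size):
        write(data[start:start + chunk_size])

out = io.StringIO()                  # stand-in for the real client stream
html = "x" * 3072                    # stand-in for the HTML payload
send_in_chunks(html, out.write)
assert out.getvalue() == html        # nothing lost, just more writes
```

This trades a few extra writes for staying under the limit, which matches the observation that the transfer works but takes longer.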

    Does this have anything to do with Java RMI? or with Java come to think of it?

  • Data Replication Between Sqlserver and Oracle11g using materialized view.

I have SQL Server 2005 as my source and Oracle 11g as my target. I need to populate the target daily with the change data from the source.
For that we have created a DB link between SQL Server and Oracle and replicated the table as a materialized view in Oracle.
The problem we are getting is that the fast refresh option is not available; each day it picks up the full data from the source.
Is there any way to use fast refresh in this scenario?
Thanks in advance.
Regards,
Balaram.
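When fast refresh is not available over a DB link, one generic fallback is an incremental pull keyed on a "last modified" timestamp (a high-water mark). This is a pure-Python sketch of the idea only; the row structure and column names are hypothetical, and a real implementation would run the equivalent filter as SQL against the source:

```python
# Hypothetical high-water-mark delta pull, as an alternative to a full
# refresh when fast refresh of the materialized view is unavailable.
rows = [
    {"id": 1, "val": "a", "modified": 100},
    {"id": 2, "val": "b", "modified": 200},
    {"id": 3, "val": "c", "modified": 300},
]

def pull_delta(source_rows, high_water_mark):
    """Return only rows changed since the last sync, plus the new mark."""
    delta = [r for r in source_rows if r["modified"] > high_water_mark]
    new_mark = max((r["modified"] for r in delta), default=high_water_mark)
    return delta, new_mark

delta, mark = pull_delta(rows, high_water_mark=150)
print(len(delta), mark)   # 2 300
```

Each daily run then transfers only the delta and persists the new mark, which is what fast refresh would have done via a materialized view log.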

Please do not post duplicates - Data Replication Between Sqlserver and Oracle11g using materialized view.

  • Data mismatch between 10g and 11g.

Hi
We recently upgraded OBIEE to 11.1.1.6.0 from 10.1.3.4.0. While testing we found a data mismatch between 10g and 11g in the case of a few reports which include a front-end calculated column with a division in it, for example ("- Paycheck"."Earnings" / COUNT(DISTINCT "- Pay"."Check Date")) / 25.
The data matches in the following scenarios:
1) When the column is removed from both 10g and 11g.
2) When the aggregation rule is set to either "Sum" or "Count" in both 10g and 11g.
It would be very helpful and greatly appreciated if any workaround/pointers to solve this issue could be provided.
    Thanks
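Mismatches like this are typical of calculated columns containing division: aggregating before dividing gives a different number than dividing row by row and then aggregating, so the result depends on which order the server applies. A small illustration with made-up earnings and check-date counts (the figures are hypothetical, not from the report):

```python
# Why a calculated column with division can differ between evaluation
# orders: sum-of-ratios vs ratio-of-sums on the same hypothetical rows.
rows = [
    {"earnings": 1000.0, "distinct_check_dates": 2},
    {"earnings": 3000.0, "distinct_check_dates": 4},
]

# Divide per row, then aggregate (one possible evaluation order):
sum_of_ratios = sum(r["earnings"] / r["distinct_check_dates"] / 25
                    for r in rows)

# Aggregate first, then divide (the other order):
ratio_of_sums = (sum(r["earnings"] for r in rows)
                 / sum(r["distinct_check_dates"] for r in rows) / 25)

print(sum_of_ratios)   # 50.0
print(ratio_of_sums)   # ~26.67
```

That the data matches once an explicit aggregation rule ("Sum" or "Count") is set in both versions is consistent with this: the explicit rule pins down one evaluation order on both servers.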

    jfedynic wrote:
    The 10g and 11.1.0.7 Databases are currently set to AL32UTF8.
    In each database there is a VARCHAR2 field used to store data, but not specifically AL32UTF8 data but encrypted data.
    Using the 10g Client to connect to either the 10g database or 11g database it works fine.
    Using the 11.1.0.7 Client to go against either the 10g or 11g database and it produces the error: ORA-29275: partial multibyte character
    What has changed?
    Was it considered a Bug in 10g because it allowed this behavior and now 11g is operating correctly?
    29275, 00000, "partial multibyte character"
    // *Cause:  The requested read operation could not complete because a partial
    //          multibyte character was found at the end of the input.
// *Action: Ensure that the complete multibyte character is sent from the
//          remote server and retry the operation. Or read the partial
//          multibyte character as RAW.
It appears to me a bug got fixed.

  • Data sync between oracle and sql server

Greetings Everyone,
Your expert views would be highly appreciated regarding the following.
At work we are evaluating different solutions to achieve data synchronization between Oracle and SQL Server databases. The data sync I mention here is for live applications. We are running Oracle EBS 11i with custom applications and intend to implement custom software based on .NET and SQL Server. The whole research is to see updates and data changes whenever they happen between these systems.
I googled and found Oracle GoldenGate, Microsoft SSIS, WisdomForce from Informatica...
If you can pour in more knowledge, that would be great.
    Thank You.

Most of the work involved has to be done through scripts, and there is no effective GUI to implement OGG. However, using the commands is not very tough, and they are very intuitive.
These are the steps, from a high level:
1. Get the appropriate GG software for your source and target OS.
2. Install GG on the source and target systems.
3. Create the Manager and Extract processes on the source system.
4. Create the Manager and Replicat processes on the target system.
5. Start these processes.
First try to achieve uni-directional replication; then bi-directional is easy. I have implemented bi-directional active-active replication using Oracle DBs as source and target. Refer to the Oracle installation and admin guides for more details.
    Here is a good article that may be handy in your case.
    http://www.oracle.com/technetwork/articles/datawarehouse/oracle-sqlserver-goldengate-460262.html
    Edited by: satrap on Nov 30, 2012 8:33 AM

  • Data transfer between EREC 606 and HR system EHP5

Hi All,
We have a similar scenario to these discussions:
http://scn.sap.com/thread/3513535
http://scn.sap.com/thread/3513421
with the difference that our system hosting partner has recommended EREC 606 as the right level. The main reason, as I recall, was that our SAP Portal is on level 7.31.
As I see in those discussions, others are using EREC 605 with ERP EHP5.
Our integration between ERP (HR) and EREC will be limited to the transfer of job positions to be recruited. In our business model we have a payroll outsourcer: the hired persons will be transferred from EREC via SAP PI to the outsourcer, and then the outsourcer will send us a file with the new and changed employees, which SAP PI will read into SAP HR.
1. Are we building the right landscape? We are now just about to install EREC 606 - will it work with ERP EHP5?
2. The second question is whether we have to activate any business functions in SAP HR (ERP)?
Cheers
Waldek

Dear Nicole,
Thank you very much!
I would like to double-check question no. 2:
- Indeed we have EREC 606 standalone (without the ERP component).
- The SAP ERP system (including HR) on EHP5 will communicate with EREC through ALE.
Our hosting partner and application partner are now seriously considering whether we need to activate the following BFs on SAP ERP:
HCM_ASR_CI_1
HCM_MSS_ERC_CI_1
HCM_HIRE_INT_CI_1
Does your answer mean: NO - WE DO NOT NEED TO ACTIVATE THEM (?)

  • Data transfer between OSX 10.9 and WIN 7 stops

Hello,
Unfortunately, I have a problem with data sharing through SMB.
I can connect to my Mac easily and see the shared folders and data. But when attempting to copy files from the Mac to the Win 7 PC there is an error: the data transfer is terminated after a while. The file used in the test is an MKV with a size of 1.34 GB.
The Mac is accessed from a Win 7 PC with administrator privileges. The systems are connected via Wi-Fi 802.11n.
The exchange of data between OS X machines using SMB or AFP works without any problems.
Maybe someone has a solution or a tip for me.
Regards

    Hi ip0wn,
    Welcome to the Support Communities!
    The article below may be able to help you with this issue.
    Please click on the link for more details and troubleshooting steps:
    Mac OS X: How to connect to Windows File Sharing (SMB)
    http://support.apple.com/kb/HT1568
    Cheers,
    - Judy

  • Question about transfer between oracle and sql server

Could I write a program to transfer lots of data between Oracle and SQL Server quickly?
I have tried making two connections between the two databases, but it took me a lot of time to transfer the data.
How could I do it?
    Thanks

    Hi,
If you need to move data fast, then use the Oracle Migration Workbench to generate SQL Server BCP data extraction scripts and Oracle SQL*Loader files.
This is the fastest way to migrate the data.
In the Oracle Model UI tab of the Oracle Migration Workbench, right-click on the tables folder; there is a menu option to 'Generate SQL Loader ...' scripts. This will help you migrate your data efficiently.
    Regards
    John
