DEV to QA transfer

hi experts,
when we are transferring the InfoSource from
dev R/3 to production R/3,
how do we transfer it and what changes do we have to make?
please revert
venu

Hi,
Collect/save all your required objects into a transport request and make sure that
all required objects are captured in the TR. Then go to transaction SE09 and release
the TR along with the tasks below it. Next go to transaction STMS, refresh
the view, and you will find your TR; select that request and use the transport button,
where you choose the required target system in the transport layer, and start the import.
If you have a quality system, move it to QA first, then repeat the
above activity and do the import again for PRD.
That way you can transfer your objects from DEV--->PRD.
Hope it helps...
Assign points if useful..
Cheers,
Pattan.
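For reference, STMS drives the OS-level transport control program tp behind the scenes. Purely as an illustration (the request number DEVK900123, target SID QAS, client 100 and the profile path below are hypothetical placeholders, not values from this thread), the equivalent command line can be sketched like this:

```shell
# Sketch only: build the tp import command a Basis admin might run at OS level.
# DEVK900123, QAS, 100 and the profile path are hypothetical placeholders.
TR="DEVK900123"        # released transport request
TARGET="QAS"           # target system ID
CLIENT="100"           # target client
PROFILE="/usr/sap/trans/bin/TP_DOMAIN_${TARGET}.PFL"

CMD="tp import ${TR} ${TARGET} client=${CLIENT} pf=${PROFILE}"
echo "${CMD}"
```

In practice you would let STMS issue this for you; the sketch just shows what the import queue does with a released request.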

Similar Messages

  • How to create a new request for function group?

    Hi Friends,
    My client has changed their development server. All the objects in the old DEV server have now been transferred to the new one. We have created new change request numbers for everything except the function group objects, so now I need to create a new request for those. Can anyone suggest which transaction code I should use for this purpose?
    Thanks,
    Kaarthick.V

    Hi Karthick,
    If you want to change the request for your function group:
    1. Go to SE80 -> Workbench -> Edit Object.
    Here you have the option to change the request.
    Or, if you want to create the function group:
    Go to SE80 -> select Function Group -> enter a name -> press Enter.
    Thanks,
    Reward If Helpful.

  • Transport request in quality server

    Hi everyone, never mind if this is not the correct place to post this question. I have access to the ECC QUALITY server. I tried to generate the DataSources in the ECC Quality server, and it gave a "Prompt for local Workbench request" pop-up window.
    When I look for the available requests through F4 (input values), there are no input values.
    Don't we have the possibility to activate DataSources in the quality server? Why is that? And if anybody knows the details about development, quality and production servers, could you please give me a broad explanation? I'm confused about these three types of servers and how they work!
    Thank you very much in advance for any help,
    Edited by: yarpath on Sep 22, 2011 11:21 AM

    Hi,
    Along with the details already provided, here is some info about the servers.
    In an SAP landscape we have a set of systems, often called the SAP architecture:
    DEV, QAS and PRD.
    - DEV typically has multiple clients, for example:
      190 - Sandbox: the sandbox doesn't affect the other servers or clients; it is a standalone system.
      100 - Golden: development and changes are transferred to the next level via transports. It contains all configuration and master data, so all configuration settings are done in this client and then moved to the other clients. Since it acts as a master copy for all settings, it is called the "Golden Client".
      180 - Unit Test: if configured, transports from the golden client come here.
    - QAS may again have multiple clients:
      300 - Integration Test
      700 to 710 - Training
    - PRD usually has one client.
    How many clients there are depends on the client's business scenario.
    Requests flow like this:
    DEVELOPMENT -> QUALITY -> PRODUCTION
    DEVELOPMENT: where developers create new objects and flows, or change existing ones.
    QUALITY: the objects moved from DEV are tested here. It holds a large amount of data and is usually a copy of production, so new development gets tested with realistic data; any bugs found can be fixed back in DEV and transported again.
    PRODUCTION: day-to-day business data is present here, coming from different systems (R/3, flat files, non-SAP systems, etc.).
    - In quality or production you cannot see your transport request, and the system will not allow you to change objects.
    - Usually we only have access to create DTPs, process chains, programs, etc. in production; we cannot change transformations and the like, as the flow must not be disturbed.
    - To see the status of objects moved to any system via transport, check the transport logs in the DEV system.
    - The logs will also tell you which systems are connected to your DEV system.
    Hope this is helpful.
    Regards,
    Jaya Tiwari

  • Transferring apps from a personal dev account to an enterprise account

    Has anyone had an experience with the subject?
    My friend initially created several iPhone and iPad apps under his personal dev account and is ready to move on... He is thinking of registering a company and creating an enterprise account, but is concerned with how he will be able to transfer all the apps: how long that would take, whether the apps will lose their ratings, whether he will still be able to notify existing users of new versions, etc. Should he transfer individual apps, or is there an option to transfer the entire account?
    Please share your experience or point me to the right source... My friend is apparently too shy to ask himself, but I figured that pretty soon I will be facing the same question myself...
    Thanks everyone for your responses

    I will tell you my experience with this.
    I'm doing some work for a stealth startup company.
    They registered everything under the company name.
    They wanted to change the company name and re-release the app under a new name.
    In effect, they had to "start over".
    New developer account, new iTunes Connections accounts, etc.
    It seems there is no "easy" way for an entity to simply change its name.
    So, for your friend, I would say, leave things already in the app store in his name.
    Release new versions under the new name.
    This might not work easily, as he may have to abandon / remove the old apps from the app store in order to release the app with the same name in the app store under a different company/entity name.  Or maybe not.
    I'd advise him to just call Apple Developer Support and ask....
      "Can I re-release my current apps as a new version with the same name using a different company name?"
    The details being the app bundle will be different, com.old.myapp  vs. com.new.myapp, as well.
    The real answer will be if Apple allows the name-space collision from two entities.
    In our case, our app and company name were the same, and when the company name changed, they made them separate, and also picked a new name for the app taking the product in a new marketing direction.
    In your friend's case he wants to keep his popularity and ratings, and I don't know how Apple handles that part.
    The question is more: can I transfer the ownership of my app to another entity without losing its ratings?

  • Issue in Data transfers using ALE for E-Recruiting standalone system

    Dear Experts,
    We have an issue with data transfers using ALE. We have a standalone system where E-Recruiting is maintained separately; the version is EHP5.
    When we configured the iDoc in Sandbox it worked fine, but in the Dev and Quality servers it does not.
    we have created iDocs using the following filters.
    For Object P: Infotypes 0000, 0001, 0002, 0003, 0006, 0024, 0105.
    For Objects O,S: 1032, Subtype: 0010
    For Object: 1001, Subtype: A003, A008, A012, B007
    For Object O: 1001, Subtype: A002, B002, B003, B012, Type of related object: O, S
    For Objects: C, O, S, Subtype: 1000, 1002, 1028
    For Object:C, Subtype: 1001, Subtype: A007
    When we do the data transfers in Sandbox using PFAL, it transfers the data properly.
    But when we do the same in the Dev and Quality servers we are not able to, and we get status 52 and 51 errors.
    We are facing a strange issue here. For some users the data is transferred, but with a status 52 error message.
    For other users I get a status 51 error and cannot transfer any data at all.
    For some users, the O-, S- and P-related data moves but mostly with status 52 errors, and not all of the relationships for O and S move properly; the P-related CP, BP and NA objects, however, are created successfully.
    This happens only in the Dev and Quality servers, and even there only some users get transferred, not all.
    Where will be the issue? How can we resolve this?
    Any help will be appreciable.
    Thank you in advance.

    Hi Sekhar,
    Have a look over these threads.
    Errors in ALE data transfer to E-Recruiting.
    http://scn.sap.com/thread/1535402    
    BR,
    RAM.

  • [solved] transferring (migrate) my install to a new hard drive

    I did some googling, but could not find a howto which replicated my situation...
    I bought a new SSD for my netbook and I am wondering what the best way to go about transferring my arch install to it without going thru an install-from-scratch would be...
    I'm planning to partition the new drive with / (10GB) and /home (the rest) in separate partitions.
    I bought a usb hdd external enclosure and plan to connect the SSD to it to make the changes.
    my old hard drive is 160GB and the new one will be 30GB, so I'm guessing dd would not work well? Or rather, I am not sure how to use dd in this situation (in part because some kind of "partition alignment" is needed for SSDs, which I guess dd would break? and the size mismatch would cause me trouble?). any pointers?
    I'm guessing rsync would help me in this situation (I'm more familiar with it, as I do my backups via rsync -av --delete --checksum), but then how do I transfer grub to the new drive, so that when I (internally) connect the SSD and boot, all I have to do is wait as it boots to *my* Arch without me having to do anything?
    also, if I do an rsync, which files would I need to change so the thing boots ok? I know I will have to change fstab entries to get the id of the partitions right, but what else?
    any ideas, or pointers, or links would be very much appreciated.
    thanks and sorry for this newbie question...
    update: solved using the following:
    pyther's comment below:
    1) Boot Live CD
    2) Partition New Drive
    3) mount old drive and all partitions /boot, /home, etc... (/mnt/old)
    4) mount new drive and all partitions /boot, /home, etc... (/mnt/new)
    5) Use cp or any tool that transfers files, making sure to preserve permissions. I would suggest the command cp -a /mnt/old/. /mnt/new/ (the trailing /. form also picks up hidden files)
    6) Mount /proc, /dev, /sys (mount -t proc none /mnt/new/proc; mount -o bind /dev /mnt/new/dev; mount -o bind /sys /mnt/new/sys)
    7) chroot /mnt/new /bin/bash
    8) Modify any configuration files (ex. fstab and menu.list)
    9) Rebuild initrd image mkinitcpio -p kernel26
    10) Install grub
    11) unmount everything
    12) reboot
    panuh's link
    Quick search gave me this: http://www.linuxjournal.com/article/10093
    things to be careful about: (1) grub's menu.list has both uuid and hd(X,Y) fields that need to be updated, (2) if you use an ubuntu live cd and chroot's grub-install doesn't work for you, ubuntu comes with grub2. when it craps on you during boot, see http://wiki.archlinux.org/index.php/GRU … ue_console . using this, after booting to arch, use grub-install along with --recheck http://www.gnu.org/software/grub/manual … stall.html
    Last edited by vajorie (2010-03-10 23:41:24)

    Graysky covered it fairly nicely; I'll add my own notes...
    1) Boot Live CD
    2) Partition New Drive
    3) mount old drive and all partitions /boot, /home, etc... (/mnt/old)
    4) mount new drive and all partitions /boot, /home, etc... (/mnt/new)
    5) Use cp or any tool that transfers files, making sure to preserve permissions. I would suggest the command cp -a /mnt/old/. /mnt/new/ (the trailing /. form also picks up hidden files)
    6) Mount /proc, /dev, /sys (mount -t proc none /mnt/new/proc; mount -o bind /dev /mnt/new/dev; mount -o bind /sys /mnt/new/sys)
    7) chroot /mnt/new /bin/bash
    8) Modify any configuration files (ex. fstab and menu.list)
    9) Rebuild initrd image mkinitcpio -p kernel26
    10) Install grub
    11) unmount everything
    12) reboot
    Edit: Don't be afraid to ask for more help
    Last edited by pyther (2010-03-08 22:29:55)
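Step 5 above deserves a caveat: a shell glob such as cp -av /mnt/old/* /mnt/new/ silently skips top-level dotfiles (the glob does not match names starting with a dot), and writing the destination as /mnt/new/* is a bug, since the glob would expand on the target side too. The difference can be demonstrated safely on throwaway temp directories:

```shell
# Demonstrate that 'src/*' misses hidden files while 'src/.' copies everything.
# Uses throwaway temp directories instead of the real /mnt/old and /mnt/new.
old=$(mktemp -d); new_glob=$(mktemp -d); new_dot=$(mktemp -d)
echo data   > "$old/file"
echo secret > "$old/.hidden"

cp -a "$old"/* "$new_glob"/ 2>/dev/null   # glob form: .hidden is skipped
cp -a "$old"/. "$new_dot"/                # dot form: hidden files copied too

[ -e "$new_glob/.hidden" ] && echo "glob copied .hidden" || echo "glob missed .hidden"
[ -e "$new_dot/.hidden" ]  && echo "dot copied .hidden"  || echo "dot missed .hidden"
```

For a whole-root copy this mostly matters if / itself contains dotfiles, but the /mnt/old/. form is the safe habit either way.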

  • ABAP Dictionary activation problem when transferring request from dev to prod

    hi experts
    i developed an enhancement in which i appended a structure to the AFRU table.
    when transferring the request from dev to production there is an
    "ABAP Dictionary activation" error.
    thanks
    ajay

    hi
    the problem is not solved.
    see the error below:
      Activate table ZZAFRUD
       Field RPM: Component type or domain used not active or does not exist
       Nametab for table ZZAFRUD cannot be generated
       Table ZZAFRUD was not activated.
    the structure ZZAFRUD is active, but the error shows it as not active.
    the domain ZRPM i used is also active.
    thanks
    ajay.

  • Books transferred from Itunes to Ibooks loses author and category information.

    How can I get author and category information for PDFs to transfer from iTunes into the iBooks app on Mavericks?  I just transferred my book collection from iTunes to the iBooks app and have lost all the author and category information that I entered on all my PDFs.  Do I now have to re-establish this info in the iBooks app?  iBooks doesn't appear to allow creating new categories or editing the author's name after it assigned "Unknown" to all my PDFs.  My book library appears to have been seriously downgraded by this "upgraded" OS.

    Totally agree with you; exactly the same here.
    You can't re-establish this info in iBooks because it doesn't allow you to do it. No editing and organising at all (apart from moving a book into a section of your choice).
    In addition to no editing/changing: no backup/restore, no books/PDF location settings (everything is placed in the user library), etc. ... simply a complete disaster.
    For me this is a deal-breaker for using OS X Mavericks! It forces me to move back to OS X Mountain Lion!
    What did the Apple iBooks dev team think when creating this application? Come on, this can't be true!

  • Old computer - slow transfers for sata

    hi all,
    I've set up a Linux box with very old hardware. I'm talking about a Pentium II 350 MHz on an Asus P5B with 192 MB RAM. I do have some new disks, though, with which I am experiencing slow transfers, and I would like some advice on what I could do. Or maybe the answer is simply: the CPU is too slow for decent speeds.
    Now, briefly, what I have is:
    - on a PCI slot, an old Promise FastTrak TX4 SATA 150 controller
    - on it, 3 WD20EARS 2TB disks
    - consider 2 of them; they have been partitioned with parted
    - the partitioning method was (for both): gpt label, mkpart with starting sector 2048s and end -1. This should have guaranteed alignment with the drive's 4K sectors
    - both use LVM: each disk is a single big PV in a single VG with a single LV (one drive is where I store all my photos, 2TB big; the other is where I store backups of them, 2TB)
    when I transfer files between them (tens of folders of hundreds of files, sizes 600kB-4MB) for backup with rsnapshot, I see the following, using iostat 5 -m:
    avg-cpu: %user %nice %system %iowait %steal %idle
    37.33 0.80 60.88 1.00 0.00 0.00
    Device: tps MB_read/s MB_wrtn/s MB_read MB_wrtn
    sda 0.00 0.00 0.00 0 0
    sdb 83.03 5.69 0.01 28 0
    sdc 19.96 0.00 5.63 0 28
    scd0 0.00 0.00 0.00 0 0
    sdd 14.57 0.19 0.02 0 0
    dm-0 1440.52 0.00 5.63 0 28
    dm-1 0.00 0.00 0.00 0 0
    dm-2 1449.10 5.66 0.00 28 0
    dm-3 1449.10 5.66 0.00 28 0
    dm-4 2.00 0.00 0.01 0 0
    now the question: is there a way to get better speeds and to lower cpu load?
    please note that, according to hdparm, everything should be correct (only the hdparm -I output of one drive is shown here; they are exactly the same):
    ATA device, with non-removable media
    Model Number: WDC WD20EARS-00MVWB0
    Serial Number: WD-WCAZA2284951
    Firmware Revision: 51.0AB51
    Transport: Serial, SATA 1.0a, SATA II Extensions, SATA Rev 2.5, SATA Rev 2.6
    Standards:
    Supported: 8 7 6 5
    Likely used: 8
    Configuration:
    Logical max current
    cylinders 16383 16383
    heads 16 16
    sectors/track 63 63
    CHS current addressable sectors: 16514064
    LBA user addressable sectors: 268435455
    LBA48 user addressable sectors: 3907029168
    Logical/Physical Sector size: 512 bytes
    device size with M = 1024*1024: 1907729 MBytes
    device size with M = 1000*1000: 2000398 MBytes (2000 GB)
    cache/buffer size = unknown
    Capabilities:
    LBA, IORDY(can be disabled)
    Queue depth: 32
    Standby timer values: spec'd by Standard, with device specific minimum
    R/W multiple sector transfer: Max = 16 Current = 0
    Recommended acoustic management value: 128, current value: 254
    DMA: mdma0 mdma1 mdma2 udma0 udma1 udma2 udma3 udma4 udma5 *udma6
    Cycle time: min=120ns recommended=120ns
    PIO: pio0 pio1 pio2 pio3 pio4
    Cycle time: no flow control=120ns IORDY flow control=120ns
    Commands/features:
    Enabled Supported:
    * SMART feature set
    Security Mode feature set
    * Power Management feature set
    * Write cache
    * Look-ahead
    * Host Protected Area feature set
    * WRITE_BUFFER command
    * READ_BUFFER command
    * NOP cmd
    * DOWNLOAD_MICROCODE
    Power-Up In Standby feature set
    * SET_FEATURES required to spinup after power up
    SET_MAX security extension
    Automatic Acoustic Management feature set
    * 48-bit Address feature set
    * Device Configuration Overlay feature set
    * Mandatory FLUSH_CACHE
    * FLUSH_CACHE_EXT
    * SMART error logging
    * SMART self-test
    * General Purpose Logging feature set
    * 64-bit World wide name
    * WRITE_UNCORRECTABLE_EXT command
    * {READ,WRITE}_DMA_EXT_GPL commands
    * Segmented DOWNLOAD_MICROCODE
    * Gen1 signaling speed (1.5Gb/s)
    * Gen2 signaling speed (3.0Gb/s)
    * Native Command Queueing (NCQ)
    * Host-initiated interface power management
    * Phy event counters
    * NCQ priority information
    DMA Setup Auto-Activate optimization
    * Software settings preservation
    * SMART Command Transport (SCT) feature set
    * SCT LBA Segment Access (AC2)
    * SCT Features Control (AC4)
    * SCT Data Tables (AC5)
    unknown 206[12] (vendor specific)
    unknown 206[13] (vendor specific)
    Security:
    Master password revision code = 65534
    supported
    not enabled
    not locked
    not frozen
    not expired: security count
    supported: enhanced erase
    402min for SECURITY ERASE UNIT. 402min for ENHANCED SECURITY ERASE UNIT.
    Logical Unit WWN Device Identifier: 50014ee205346fd3
    NAA : 5
    IEEE OUI : 0014ee
    Unique ID : 205346fd3
    uname -a is:
    Linux myhost 2.6.36-ARCH #1 SMP PREEMPT Sat Jan 8 13:16:43 UTC 2011 i686 Pentium II (Deschutes) GenuineIntel GNU/Linux
    any help?
    thanks a lot
    Gemon
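As a side note on the alignment point in the question: a partition starting at sector 2048 is indeed 4K-aligned, which a little arithmetic confirms (the start sector times the 512-byte logical sector size must be divisible by the 4096-byte physical sector size):

```shell
# Check 4K alignment of a partition's start sector (values from the post above:
# parted start at sector 2048, 512-byte logical sectors, 4K physical sectors).
START=2048; LOGICAL=512; PHYSICAL=4096
if [ $(( START * LOGICAL % PHYSICAL )) -eq 0 ]; then
    echo "sector ${START}: aligned"
else
    echo "sector ${START}: NOT aligned"
fi
```

So misalignment can be ruled out here, which points the investigation at the CPU and the PCI controller instead.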

    I will try adding some RAM; I should have 128 MB in my wife's brother's PC.
    I ran some quick tests with dd to see if it is a problem with the PCI channel, and apparently it looks like it is. Tomorrow I will try to move one of the two disks to the other SATA controller I have in the other PCI slot.
    The first test writes a sequential file to a WD20EARS (/dev/sdbX, or /photo/photo as LVM).
    The second test reads from that same disk; 75 MB/s looks like a nice speed, considering the host PC and a PCI slot bound at 133 MB/s bandwidth.
    The third test copies from one WD20EARS to the other.
    The fourth test copies from one WD20EARS to the PATA drive where I have Linux installed; consider that this one is on the motherboard controller and can do udma/33 only.
    [root@myhost gemon]# dd if=/dev/zero of=/photo/photo/output.img bs=8k count=256k
    262144+0 records in
    262144+0 records out
    2147483648 bytes (2.1 GB) copied, 146.265 s, 14.7 MB/s
    [root@myhost gemon]# dd if=/photo/photo/output.img of=/dev/null bs=8k count=256k
    262144+0 records in
    262144+0 records out
    2147483648 bytes (2.1 GB) copied, 28.3661 s, 75.7 MB/s
    [root@myhost gemon]# dd if=/photo/photo/output.img of=/backup-photo/output.img bs=8k count=256k
    262144+0 records in
    262144+0 records out
    2147483648 bytes (2.1 GB) copied, 93.4719 s, 23.0 MB/s
    [root@myhost gemon]# dd if=/photo/photo/output.img of=/home/gemon/output.img bs=8k count=256k
    262144+0 records in
    262144+0 records out
    2147483648 bytes (2.1 GB) copied, 116.291 s, 18.5 MB/s
    tomorrow I'll redo some tests with the second WD20EARS on the other PCI slot.
    thanks everybody for the first answers
    Gemon
    PS: filesystem is ext4 on all those lvm logical volumes
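One more variable worth isolating on a CPU this slow is dd's block size: bs=8k means roughly 262,000 read/write syscall pairs for 2 GB, and per-call overhead can dominate on a 350 MHz Pentium II. A quick, disk-free way to see the syscall cost alone (this measures only CPU overhead, not the drives, so the absolute numbers here say nothing about the SATA controller):

```shell
# Push the same 64 MiB through dd with a small and a large block size.
# /dev/zero -> /dev/null touches no disk, so any speed difference between the
# two runs is pure per-syscall/CPU overhead.
for spec in "8k 8192" "1M 64"; do
    set -- $spec                 # $1 = block size, $2 = count (both give 64 MiB)
    echo "block size: $1"
    dd if=/dev/zero of=/dev/null bs=$1 count=$2 2>&1 | tail -n 1
done
```

If the 8k run is much slower than the 1M run on that box, rerunning the real copy tests with a larger bs would be worth trying.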

  • WEBI reports DEV to QA to PROD

    Hi. I need help understanding how WEBI reports are moved from Dev to Qa to Prod?
    Thanks.

    Hi
    you have three option:
    1) Use the Import Wizard to transfer your reports, universes and universe connections, but in this case you have to modify the universe connections after each transport.
    2) Use the Import Wizard to transfer your reports, universes and universe connections initially. After the initial transfer, modify the universe connections in the QA and PRD environments according to your needs. From that point on, choose to transfer only the reports and universes, but avoid transferring the related connections. This way the modified connections are not overwritten and you do not have to modify them in the future.
    3) Use the Lifecycle Manager (you have to upgrade to XI 3.1 for this).
    Regards,
    Stratos

  • Business system not in dev but in QA

    Hi,
    we have a scenario where ECC sends a message to PI, which sends a web service request out to a provider.
    We have only 1 ECC system in DEV but 2 ECC systems in QA: one for QA and one for training.
    Now we want both ECC systems to be able to send messages to the same PI system in QA, and PI has to route the request to the right web service provider. How can we configure this in DEV and transport it if there is only 1 ECC system in DEV?
    Is there a way to create a virtual business system for training in DEV (i.e. one with no technical system), configure against this system in DEV, and then transport to QA, where the real training system (the one which has a technical system) exists?
    Thanks.

    Hi Thierry
    For QA ECC 1
    1) Export the full SLD content from DEV to QA.
    2) Create the technical system (T.S.) & business system (B.S.) for QA ECC 1.
    3) Create a transport group & provide a transport target in the QA SLD for the DEV B.S.
        This will automatically convert your DEV B.S. to the QA B.S. in the configuration.
    4) Import all objects in the IR & ID.
    For QA ECC 2:
    1) Create a new business system for the Training ECC in the QA SLD.
    2) Since we can't give 2 transport targets for 1 DEV B.S., you have to create a new configuration scenario and import your new B.S.
    3) You have to do a separate configuration with your new B.S. as the sender B.S.
    4) Your IR objects will be the same for both configurations.
    Also import the IDoc metadata from both ECC systems in IDX2.
    When an IDoc is triggered from ECC1 or ECC2 in QA, it will use the same mapping and the data will be transferred to the target system.
    Regards
    Abhijit

  • How does your entity, or entities you have worked with, manage data for their dev/quality/cert environments?

    Background: We have development and certification environments for SAP changes. These environments are kept in sync by ensuring that we adhere to a progression for configuration changes; they always move through the same dev, quality, certification, production process, with testing at each phase.
    As a business user, a challenge I encounter is having production-type data available in the dev/qual/cert environments so that I can adequately test changes.  We refresh our environments periodically to a copy of production, including data.  After the refresh, no new data goes to these environments, only test data as entered by individual users and testers.  This means that tests performed soon after a refresh have a lot of relevant production data to use for testing, but as we move further through the year, the data becomes less and less available and less relevant.
    Some of our changes are impacted by volume, but due to how we handle our environments, volume impacts are difficult to test or assess.
    How do you manage production-like data for your dev/qual/cert environments? Do you encounter this same issue, or have you found a solution?
    Thank you!


  • [SOLVED] Problem transferring large amounts of data to ntfs drive

    I have about 150GB of files that I am trying to transfer from my home partition, which is ext3 (/dev/sdd2), to another partition which is NTFS mounted with ntfs-3g (/dev/sde1); both drives are internal SATA. The files range in size from 200MB to 4GB. When the files start to move they transfer at a reasonable speed (10 to 60MB/s), but will randomly (usually after about 1 to 5GB transferred) slow down to about 500KB/s. The computer becomes unusable at this point, and even if I cancel the transfer it continues to be unusable (I must use alt+sysrq REISUB). I have tried transferring with Dolphin, Nautilus, and the mv command, but they all produce the same results. I have also tried this in Dolphin as root with no change. If I leave Dolphin running long enough I also get the message "the process for the file protocol died unexpectedly".
    There is nothing that I can tell is wrong with the drives, I've run disk checks on both, and checked the S.M.A.R.T. readings for both disks and everything was fine.
    My hardware is an intel X58 motherboard, core i7 processor, 6GB of RAM, 5 internal sata drives and 1 internal sata optical drive.
    Another thing to note is that every once in a while I will get an error message at boot saying "Disabling IRQ #19".
    This is driving me crazy as I have no idea why this is happening and when I search I can't find any solutions that work.
    If anybody knows how to solve this or can help me diagnose the problem please help.
    Thank you.
    Last edited by zetskee (2011-01-07 21:29:58)

    Primoz wrote:Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also I've noticed that if i move / copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also run Dolphin from terminal to try and see what's the problem.
    Hope that help at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.
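For the record, the symptom described above (a fast initial burst, then a crawl and an unresponsive system) is the classic signature of the kernel accumulating gigabytes of dirty pages and then stalling while ntfs-3g drains them. A commonly suggested mitigation, offered here as a hedged suggestion rather than a confirmed fix for this exact machine, is to lower the kernel's writeback thresholds so data is flushed earlier and in smaller chunks:

```shell
# Inspect the current writeback thresholds: the percentage of RAM that may be
# dirty before background flushing starts, and before writers are blocked.
# Values vary per system and kernel version.
cat /proc/sys/vm/dirty_background_ratio
cat /proc/sys/vm/dirty_ratio

# To try lower thresholds (run as root; does not persist across reboot):
#   sysctl -w vm.dirty_background_ratio=1
#   sysctl -w vm.dirty_ratio=5
```

With 6GB of RAM, the default percentages allow a very large dirty buffer, which matches the "1 to 5GB transferred, then 500KB/s" pattern reported.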

  • Creating Configuration Scenario: Transferring Integration Scenario from IR

    Hi ALL
    I was in the process of creating a configuration scenario by using the option "Transfer Integration Scenario from IR".
    However, I hit a roadblock at the following step:
    1. Assign Service: the application component in my integration scenario needs to be assigned to a business service, and I am unable to assign it to a business service (without a party), as the dialog only lists and allows business systems to be selected.
    Please advise how I can assign a business service to the application component of my integration scenario.
    Thank you,
    Ritu

    Hi Ritu,
    We can only assign business systems there, because they reside in the SLD.
    A business service is scenario-specific. You need to recreate it in each new environment, e.g. dev, quality, production. After creating the business service, you create the communication channel under it.
    So if you want to use a business service, create the configuration scenario without using the integration scenario.
    You can use a business service if you have party communication: define a party and add the business service to it, and then you can use it while transferring the integration scenario from the IR. Help links:
    Integration Scenario
    http://help.sap.com/saphelp_nw04/helpdata/en/ae/fb72dc1d0fbf4391fba23b7e8a0d55/frameset.htm
    http://www.sdn.sap.com/irj/scn/weblogs;jsessionid=(J2EE3414700)ID0395154950DB00487743182702389990End?blog=/pub/wlg/3157
    Regards,
    Srinivas

  • Transferring data between two production servers

    HI All,
    I have read weblogs on transferring scenarios from Dev to Qual to Prod.
    But I have a different requirement, in which I have to transfer data from one production server to another production server without disturbing the first one.
    Say A and B are two production servers; I want to transfer data from A to B on a daily basis.
    Please give me some ideas on this....
    Thanks
    Veni

    Hello,
    I would suggest using IDocs for transferring the data (master/transaction). To do that you need to set up a logical system for PROD B and an RFC connection to system PROD B.
    A distribution model also needs to be set up to push the outbound IDoc to the RFC connection through the receiver port.
    Thanks
    Krish
