Transport of Jobs in FIM 10.0

Hello All,
Can someone please let me know if there is any mechanism to transport jobs in FIM 10.0 from DEV to QA to PRD?
I am aware that I can export the jobs and import them in the other system. However, does this work properly? The web service link can be different in different systems.
Has anybody done this before? Any thoughts?
Regards

Hello,
I didn't try this before, but there is a reference in the FIM 10 User Guide about exporting/importing a job definition from one environment to another: section "15.1 Exporting a Financial Information Management Job" (p. 71).
For a custom data services job, you need to export the custom job and then import it as described in section 17.6 "Importing a Custom Job" (p. 83). As mentioned in that section, if this is the first time the custom job is moved to the target environment, you need to modify the datastore properties and the web service URL to point to the correct environment.
Hope this helps
Regards,
Mariam

Similar Messages

  • How to transport Background job

    Hi Experts,
    I have created one job using SM36 and executed it successfully. Now I want to transport that job to all my other environments, i.e. quality and production. How can I go ahead?
    Thanks in advance.
    Cheers.
    Regards,
    ve.

    A background job has its own mechanism; it depends on many entities like the user, parameters, host server, etc.
    Never try to create ABAP code to transport a batch job to another system; it takes less than 2 minutes to create a new one.
    You stay much more flexible if you follow the SAP standards.
    Regards,
    Nick Loy

  • Transporting background job

    Hi
    How do I transport the job details created in SM36 from development to production?
    Also, how do I transport the ALE IDoc configuration done in WE20?
    Thanks
    Ajay

    Hi,
    You can create the jobs in the gold client and get them transported (ask Basis to do this as per your company's naming standards for the job description).
    WE20 is client-dependent and cannot be transported; it needs to be maintained in each client across all systems.
    Basis usually does this activity as part of their cutover plan.
    Regards,
    Subramanian

  • Transport DI JOBS to Quality Server

    Hi Everyone,
    I have created one project with 2 jobs, 2 datastores, and 2 ABAP data flows. I have used an SAP development system as the source and MSSQL as the target.
    Can anyone tell me how I can transport this project to the quality server, with the quality system as the source and MSSQL as the target?
    Thanks in advance.
    Regards,
    Rishit

    The configuration will be saved under one transport request number.
    If you want to move this configuration to quality:
    Select transaction code SE09.
    Select "Modifiable" only.
    All transport requests will appear.
    Then select the transport request number and click on the "Release directly" button (the one shaped like a lorry).
    Then ask your Basis person to move it to the quality server (give the transport request number to your Basis person).
    chandra

  • Fixing standard transport import job

    Hi,
    We have R/3, APO and B/W.
    In R/3 and APO, when we move batches of transports from QA to Staging (STMS_QA) everything is fine.
    But in the past we have sometimes experienced problems when doing the same in B/W. Somebody once told us that when you move many transports together, the standard import job performs the first step for all transports, then the second step for all of them, etc.
    This is no problem for R/3 and APO, but we were told it does not always work in B/W. Because of that, we changed our process for B/W: we run STMS_QA and the import job one transport after the other. Since the import job is scheduled every 15 minutes, if we have 8 transports to move, it takes 2 hours to perform them all.
    I was asked to write a job that would do this automatically, replacing the standard import job.
    Then I said to myself: we are not the only ones in the world moving transports in batches. Does everybody reinvent the wheel each time, as I am about to do?
    Can somebody give me some thoughts on this? Any suggestion?
    Thank you!

    Discussion successfully moved from SAP Transportation Management (SAP TM) to SAP NetWeaver Administrator where it belongs.
    Your recent question posted in SCN SAP Transportation Management community is in the wrong place.  SAP TM is devoted to the SAP application which manages the physical movement of goods using trucks, trains, planes, and ships.  What you posted regarding software changes from one system to another (Dev to QA, for example) should be in the Netweaver Administrator community.  Please use the correct community in the future.
    Regards, Mike (Moderator)

  • Transporting job from one server to another

    Can anybody say whether we can transport a job from one server to another? If it is possible, can you tell me how?

    Hi,
    You can't transport a job from one server to another.
    You can only transport a variant from one system to another system, and there you can schedule your job.
    Reward if helpful.
    Thanks,
    Ponraj.s.

  • Is it possible back ground job can transport

    Hi Experts,
    I have scheduled one job in the DEVELOPMENT environment.
    Is it possible to transport it from the DEVELOPMENT environment to the TESTING environment?
    Please tell me the way to transport it.
    Thanks.

    Hi kalyan,
    We can't transport the background job.
    for more info
    Re: Why we can't transport Background jobs between systems?
    hope it helps

  • Creating a new job and position prompts customizing request...

    Dear All,
    I created a job and a position through the PP01 transaction. I am facing 2 issues while doing so:
    1. The system prompts for a customizing request when I try to save the new entry.
        What table/settings do I need to maintain to turn off the customizing request prompts?
    2. Once I save the entries (after entering some dummy customizing requests), when I check the
        HRP1000 table for the new jobs and positions, the table shows 2 records for each job and
        position: one entry with plan version '.:' and another with the active plan version '1'.
        I need to have just one entry in the HRP1000 table for each job/position, with the ACTIVE plan version.
        I checked the T77S0 table; it has the active plan version maintained against the PLOGI-PLOGI entry.
        Please let me know what table I need to maintain to handle my requirement.
    Regards
    Nanda

    Hi Nanda,
    When you create a new job or position in a sandbox system, it will not ask for a request number, since you generally don't transport objects from sandbox to the test system.
    When you create a new job or position in the development system, it will always ask for a request number. This is because, in most cases, you need to transport the job and position to the test system and then to the production system.
    Without a customizing request you won't be able to transport those changes to the next system.
    Thanks,
    Supriya.

  • Tidal Transporter

    Anyone else using Transporter to move between non-prod and prod environments? Any "best practices" you can offer?

    There should be a new version for 6.x but I can't find anything on the Cisco site.
    Some of the best practices below are 5.3.1-specific.
    Many issues are fixed in 6.x, like multiple concurrent connections.
    Transporter Best Practices, 5/25/2011:
    Remember that Transporter uses a client connection license, so don’t leave it connected.
    Create a mapping file for each environment that may be shared.
    If you have two masters (one production and one test), you will have two mapping files, one from test to production and the other from production to test.
    The mapping files reside on the network drive, along with the log files and saved transport selections.
    Always prompt for a map update. This is set in the configuration.
    Copy jobs to production as disabled if they are new jobs.
    Manually enable the new jobs after transport to the production environment via the Client.
    Existing jobs should always be copied in the same state that they are in production. If you copy an existing enabled job as disabled, the job will be removed from the existing schedules until you enable the job via the Client.
    Save selections for jobs/groups that are constantly being transported. These files should be kept in the same location as the mapping files.
    Archive activity logs. Make sure that the activity logs on the shared network drive are backed up. You will need to decide what your retention policy is and take action to implement it. Transporter does not maintain the logs after they are created.
    Disable auto select dependencies for jobs with complex or many dependencies to enhance performance. All the auto select options should be configured in the off position. Turn them on only when needed. Remember to turn them off again after use.
    To replace an existing job in the production environment, use the “includes duplicates” option in Transporter. This can be done in the configuration or by right clicking in the transport console when you have Jobs/Groups selected.
    Avoid changing the effective date when transporting jobs unless you want all the jobs being transported to have a future start date. If a job is required to be scheduled in the future, use the TIDAL Client to schedule the job after the transport.
    Commit / rollback option in the configuration should only be used when no other users are creating jobs with the Client. This option may lock other users that are using the Transporter or the Client.
    Transport your scheduling types based on the following sequence:
    Calendar, Variables, Resources, and Job classes should be transported before all other items.
    Actions must be transported before Events, with the exception of Job Actions. They can only be transported after the Job that they are putting into the schedule has been transported.
    Jobs/Job Groups must be transported after all the other types have been transported.
    Set restriction security on non-super users in the Client via the General Category in the Security Policy – “Move Jobs to Production” or “Move Own Jobs to Production”.
    Transport Archive should be enabled in the configuration. This will enable you to back out a change if needed. This is done by using the Client to remove the newly transported job, and renaming the Archived job name and changing the parent group to the correct location in the Job Definitions.

  • Error in Transport Tablespace from linux to windows

    I am testing cross-platform transportable tablespaces. According to Oracle, we can transport a tablespace from Linux to Windows without conversion because both use the same endianness (little).
    But I am failing to transport the tablespace from Linux to Windows.
    I am performing the transportable tablespace process as follows:
    On the source Oracle database server (Red Hat Linux AS 4, 32-bit, Oracle version 10.2):
    SQL> alter tablespace TEST read only;
    $ expdp system/pass dumpfile=test.dmp directory=export_dir transport_tablespaces=test transport_full_check=y
    After this I copy test.dmp and the data file (test.dbf) to the target machine (MS Windows XP 32-bit with Oracle 10.1).
    On the target machine (MS Windows XP) I give the following command:
    impdp system dumpfile=test.dmp directory=exp_dir transport_datafiles=/exp_dir/test.dbf
    But it gives the following error:
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31619: invalid dump file "c:\pks\1103.dmp"
    What may be the reason?
    Prabhaker

    Now, for version compatibility, I am including the VERSION option with expdp:
    expdp scott dumpfile=1103.dmp directory=pks transport_tablespaces=prabhu version=10.1.0.2.0
    But now it gives the following error:
    Import: Release 10.1.0.2.0 - Production on Saturday, 11 March, 2006 19:07
    Copyright (c) 2003, Oracle. All rights reserved.
    Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options
    Master table "SCOTT"."SYS_IMPORT_TRANSPORTABLE_01" successfully loaded/unloaded
    Starting "SCOTT"."SYS_IMPORT_TRANSPORTABLE_01": scott/******** DUMPFILE=1103.DMP DIRECTORY=PKS TRANSPORT_DATAFILES=C:\PKS\PRABHU version=10.1.0
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
    ORA-39123: Data Pump transportable tablespace job aborted
    ORA-06550: line 2, column 2:
    PLS-00306: wrong number or types of arguments in call to 'BEGINIMPORT'
    ORA-06550: line 2, column 2:
    PL/SQL: Statement ignored
    Job "SCOTT"."SYS_IMPORT_TRANSPORTABLE_01" stopped due to fatal error at 19:07
    regards
    Prabhu
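
    For reference, the export/import pair being attempted can be written as Data Pump parameter files. This is only a sketch: the directory object names and file paths are placeholders taken from the post, and the key point is that a dump taken by 10.2 expdp cannot be read by 10.1 impdp unless VERSION is set at export time to the lower target release.

    ```text
    # export.par -- run on the 10.2 source with: expdp system parfile=export.par
    # (directory object and tablespace names are placeholders from the post above)
    DIRECTORY=export_dir
    DUMPFILE=test.dmp
    TRANSPORT_TABLESPACES=test
    TRANSPORT_FULL_CHECK=Y
    # Needed because the target is the older 10.1 release:
    VERSION=10.1

    # import.par -- run on the 10.1 target with: impdp system parfile=import.par
    # DIRECTORY must be a directory object that maps to where test.dmp actually sits
    DIRECTORY=exp_dir
    DUMPFILE=test.dmp
    TRANSPORT_DATAFILES='C:\exp_dir\test.dbf'
    ```

    The ORA-31619 in the first attempt also typically means impdp could not read the file at the path the DIRECTORY object resolves to, so it is worth checking that the dump was copied in binary mode and that the target directory object really points at the dump's location.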

  • Transports in SCM

    Hi,
             When we transport the development in DP dev box to QA, what objects can be transported from Dev to QA and what needs to be recreated in QA? To my knowledge, we can transport POS, PA and planning books. What about the rest like forecast profiles, selections, alert profiles etc.. Is there a place where we can collect all objects and assign them to a transport?
    Any links are very appreciated.
    Thanks.

    Hi, Somnath,
    Actually, I want only specific forecast profiles to be transported. When I transported the jobs which contained these forecast profiles, the jobs and the activities got transported, while the forecast profiles didn't.
    The transaction you mentioned transports the entire planning area and all the forecast profiles associated with that planning area, which is not the requirement in my case.
    So the question boils down to: how do I transport specific forecast profiles? Or would they need to be created in the subsequent system?
    Thank you very much for your reply....
    Regards,
    Gurucharan.

  • Transportable tablespace errors

    Hi experts
    Could you please tell me how to resolve this?
    connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Master table "ABCD"."SYS_IMPORT_TRANSPORTABLE_01" successfully loaded/unloaded
    Starting "XYZ"."SYS_IMPORT_TRANSPORTABLE_01": DDEEF/******** directory = tt_import_dir dumpfile = dev_tt.dmp transport_datafiles = /apps/oracle/oradata/dev/datafile1_data-01.dbf, /apps/oracle/oradata/dev/datafile2_data-01.dbf, /apps/oracle/oradata/dev/datafile3_data-01.dbf remap_schema=dev_xxx:xxx14 remap_schema=dev_yyyy:yyyyy14 remap_schema=dev_zzzz:zzzz14 remap_tablespace=dev_zzzzz_data:zzzz14_data remap_tablespace=dev_yyyyy_data:yyyyt14_data remap_tablespace=dev_zzzzz_data:zzzzz14_data
    Processing object type TRANSPORTABLE_EXPORT/PLUGTS_BLK
    ORA-39123: Data Pump transportable tablespace job aborted
    ORA-19721: Cannot find datafile with absolute file number 11 in tablespace REPORT14_DATA
    Job "ABDC"."SYS_IMPORT_TRANSPORTABLE_01" stopped due to fatal error at 10:48:35
    thanks
    seema

    Seema,
    This is what is mentioned about the error:
    ORA-19721: Cannot find datafile with absolute file number string in tablespace string
    Cause: Cannot find one of the datafiles that should be in the pluggable set.
    Action: Make sure all datafiles are specified via the import command line option or parameter files.
    So it seems the exported dump is not complete. Retry by taking the export again and then try importing it.
    HTH
    Aman....

  • Automatic transport configuration

    Hi all,
    I am new to the transport system and I need some help.
    I have the following scenario:
    A global development system:
    system: 1
    development
    client 100
    A global quality system:
    system: 1
    quality
    client 100
    And additionally I have 4 local development systems:
    systems: 2, 3, 4 and 5
    development
    client 200
    Currently the transports between systems are manual: transaction STMS with an import (from DEV to QA) and a transmission (from QA to the local DEVs, with an approval queue). Then I need to import these transports in all the local DEVs.
    I'd like to do this automatically, at least the QA-to-local-DEVs part.
    Can I do all transmissions in only one step, i.e. a transmission to multiple systems? Or can I do the import to the local systems directly, without a transmission from the approval queue?
    Could somebody give me an idea?
    Regards, Raúl Perea

    Hello Raul,
    As I understand it, transport routes are in place and working fine. Now you just need to schedule import jobs to automate the imports.
    1. Go to transaction STMS and then Overview. Choose the system in which the automatic import is to be scheduled. The import queue of the system will show.
    2. In the application toolbar you will find 2 truck-shaped icons: one for Import All, one for Single Import.
    3. Select the Import All truck. A pop-up will appear.
    4. Under Date/Deadline, select the radio button At Start Time and give the date and time of the imports. In the period field (which is initially disabled for input), do an F4 and give the required period. Then select the radio button After Event and select the checkbox Execute Import Periodically. After this, select the At Start Time radio button again.
    5. Go to the Execution tab strip and select Execute Synchronously, and in the Options tab strip select all checkboxes.
    6. Press Enter and say yes to the resulting pop-up.
    7. Your job is scheduled for that particular system. Go to SM37 and give the job name as TMS*; you will be able to find it. Also change its job class to A, which is useful for transport import jobs.
    Please award points if the answer was useful.
    Regards.
    Ruchit.

  • Transporting JOBS from one environment to another

    Hello everyone!
    I am creating several jobs with ABAP programs and external commands. I would like to transport them from one environment to another; for example, from the development environment to quality, or from quality to production. Is there any way other than recreating them myself?
    Thanking you in advance for a prompt reply!

    Negative; apparently JOBS cannot be transported. I have asked several consultants, and they have all told me no, that you have to create them in each client.
    Regards!

  • Transportable tablepace set

    Hi
    I have included all tablespaces in the transportable tablespace set except SYSTEM, SYSAUX, UNDO, and TEMP.
    However, I am still unable to transport the tablespaces.
    I am getting the error below:
    ORA-39123: Data Pump transportable tablespace job aborted
    ORA-29341: The transportable set is not self-contained
    What is the reason for this?

    Just refer
    http://mohamedazar.wordpress.com/2010/05/29/transport-the-tablespace-on-same-os-platform/
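
    ORA-29341 means some object inside the set depends on an object outside it (for example, an index in the set whose table lives in another tablespace). As a sketch, Oracle's DBMS_TTS package can list the offending dependencies before you export; the tablespace names below are placeholders for your own set:

    ```sql
    -- Run as a user with EXECUTE on DBMS_TTS (e.g. SYSTEM).
    -- Replace the list with your own tablespace names.
    EXEC DBMS_TTS.TRANSPORT_SET_CHECK('USERS,APP_DATA,APP_IDX', TRUE);

    -- Each row names a cross-tablespace dependency that must be moved
    -- into the set (or dropped) before the export can succeed.
    SELECT * FROM TRANSPORT_SET_VIOLATIONS;
    ```

    expdp performs a similar check itself when transport_full_check=y, but running it interactively first shows exactly which objects to move into the set.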
