Importing .atl DI jobs into Data Services

Hi Experts,
I started learning Data Services a month back; I practiced and completed some assignments using the Data Integrator tool. But now we are using the Data Services tool in the project.
So, is it possible to import the .atl files of those Data Integrator jobs into the current Data Services version?
Please suggest what is necessary in order to do this.
Thanks in advance.

If you created your jobs in version 11.7 and are now using Data Services XI R3 (12.x), you can upgrade the 11.7 repository using the Repository Manager from the Data Services installation.
Open Repository Manager, enter the 11.7 repository details, and click Upgrade; this will upgrade your 11.7 jobs and data flows to 12.x.
Alternatively, if you have already created your 12.x repository and have done some development in it, then yes, you can import the 11.7 ATL into 12.x, though some options may not be set to the optimized values they would receive during the upgrade process.
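If you want to script the ATL import rather than use the Designer, a minimal sketch with al_engine follows. The repository credentials, database type, and file path are placeholders, and the switch names are from memory, so verify them against the usage output of your version's al_engine before relying on them.

    al_engine -Uds_user -Pds_password -Srepo_server -NOracle -Qrepo_db -fD:\exports\old_di_job.atl

Here -U/-P/-S/-N/-Q identify the target repository and -f names the ATL file to import; running al_engine with no arguments lists the switches your version accepts.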

Similar Messages

  • .bat or .sh file for executing jobs in data services

    Are there any command-line utilities with which I can import the .atl files and execute the jobs in Data Services?
    I was able to import the .atl files using 'al_engine' but was not able to find the arguments to execute the jobs.
    Here is my requirement:
    There would be 2 master scripts :
    Master_Initial_Script.bat/.sh - This would have all the pre/post checks and would call the DS Initial Job
    Master_Delta_Script.bat/.sh - This would have all the pre/post checks and would call the DS Delta Job
    Pre-Checks for Delta Job -
    If the initial job is not loaded then
                    Do not move further to execute the Delta job
    else
                    Execute the Delta job
    Post Checks for Delta Job:
    Print a statement when the job starts successfully;
    Check the error/return code and print a job success message in a log file

    This looks more like scheduling the job with dependencies.
    Unfortunately, the BODS scheduler doesn't support setting dependencies between jobs.
    So, the best way forward is to export the jobs to batch files in a particular location.
    Use the batch files with an external scheduling tool and set the dependency there.
    A scheduling tool such as Control-M or the Redwood Cronacle scheduler should be able to execute the batch files.
    Set the dependency in the scheduling tool accordingly; a sketch of the master-script side follows below.
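    As a minimal sketch of the Master_Delta_Script idea, assuming the Delta job has already been exported from the Management Console as a launcher script (the launcher name, flag file, and log path here are hypothetical):

        #!/bin/sh
        # Master_Delta_Script.sh - pre/post checks around the exported DS launcher
        LOG=/var/ds/logs/delta.log

        # Pre-check: do not move further if the initial load has not completed
        if [ ! -f /var/ds/flags/initial_load.done ]; then
            echo "Initial job not loaded; not executing the Delta job." >> "$LOG"
            exit 1
        fi

        echo "DS Delta job started at $(date)" >> "$LOG"
        /opt/ds/scripts/DS_Delta_Job.sh   # launcher exported from the Management Console
        rc=$?

        # Post-check: record the return code in the log file
        if [ $rc -eq 0 ]; then
            echo "Job success" >> "$LOG"
        else
            echo "Job failed with return code $rc" >> "$LOG"
        fi
        exit $rc

    The same structure translates directly to a .bat file, with %ERRORLEVEL% in place of $?.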
    Let me know if you need any more details.
    Regards,
    Kishore Reddy L.

  • Unable to run a Real_time job in Data services.

    Hi guys,
    I have a scenario where I am reading data from SAP into Data Services using an IDoc, but I am getting an error while configuring the Real-Time Services and in the Service Provider when we add the Job Server there, so the real-time services are not starting. Could you help me with the possible ways to troubleshoot this?

    I'm not sure I understand you on the CTI topic. CTI is a fundamental protocol when it comes to CUCM/UCCX, which is the future of telephony. No old-school PBXs here, just packets.
    I was just looking at the UCCX 9.0(2) REST API, and it looks like there are some stats to be farmed there; however, it's so new that I cannot really speak to it from experience.
    Check it out:
    http://developer.cisco.com/documents/11218258/11219470/Cisco-uccx-developer-guide-902.pdf?redirect=http%3A%2F%2Fdeveloper.cisco.com%2Fweb%2Fuccxapi%2Fhome%3Fp_p_id%3Ddoc_library_summary_portlet_WAR_doclibrarysummaryportlet_INSTANCE_b8N60ctZVvHc%26p_p_lifecycle%3D0%26p_p_state%3Dnormal%26p_p_mode%3Dview%26p_p_col_id%3Dcolumn-5%26p_p_col_pos%3D1%26p_p_col_count%3D2
    Anthony Holloway
    Please use the star ratings to help drive great content to the top of searches.

  • Scheduling an automatic Job in Data Services

    Dear Experts,
    Could you please explain how to schedule an automatic job in Data Services?
    I would like to be able to schedule a daily, weekly or monthly job.
    Any Document which explain the steps?
    Thanks
    Pat

    I would not suggest doing anything to your production data. Make the change in the job as given below. Add a Map_Operation transform just before the point where you write to the target table. Inside the Map_Operation transform, set everything to Discard; make sure you have selected Discard for every operation code. Then, in order to see what data your job is trying to write to the target table, create a template table and join it to the transform just before the Map_Operation. Suppose you have a Query transform at the end, right before the target table: connect that Query transform to the Map_Operation transform as well as to the template table.
    You can now see what data your job would write to the target without writing anything to the target.
    Test this in dev first before trying it in Production.
    REMEMBER: PRODUCTION IS A SACRED SPACE AND WE DON'T WRITE UNNECESSARY DATA OR DO ANY MANIPULATION IN PROD TABLES.
    Thanks,

  • Can't log into Data Services

    Hello Gurus,
    I get the error message "Cannot initialize application. (BODI-1270039)" when I try to log into Data Services.
    I've tried reinstalling the Data Services client, but the problem remains.
    What's causing this issue?

    Hi l.v,
    In Windows, open the Run dialog, enter the 'regedit' command, and look under HKEY_LOCAL_MACHINE and HKEY_CURRENT_USER; make sure that you do not have any BusinessObjects entries, and if you do, please delete them.
    This assumes you are only installing DS on this PC; if that is not the case, you need to be extra careful and remove only the entries that belong to DS.
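    To see what is there from a command prompt before deleting anything in regedit, something like the following should work (the key paths are typical for BusinessObjects products but may vary by version, so treat them as starting points):

        reg query "HKCU\Software\Business Objects" /s
        reg query "HKLM\SOFTWARE\Business Objects" /s

    Only delete keys you have confirmed belong to the DS client.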
    Cheers
    Hai.

  • Error while running a job in Data services

    We are using the Data Services BO XI R3 tool.
    We have created a datastore for an Oracle 8i database using a Microsoft ODBC DSN. This is the source datastore.
    When the job pulls data from the source to the 10g target, we get the error below in the log file:
    25219           1        CON-120902     6/26/2009      Data services ODBC Driver Manager unable to find
                                                                              data source <...> in $LINK_DIR/bin/odbc.ini file.
    Is this a UNIX/Windows error?
    Please let us know how this can be resolved.
    Thanks in advance!

    I restarted the job server and it worked, for a moment :P, but now it says more...
    1. SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database
    selected>.
    2. The SQL submitted is <select VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.
    Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source
    <Prova7>
    3. resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select
    VERSION, SECURITYKEY, GUID from AL_VERSION where NAME = 'Repository Version' >.>.
    SQL submitted to ODBC data source <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database
    selected>.
    4. The SQL submitted is <select VERSION from AL_VERSION>.
    Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source
    <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select
    VERSION from AL_VERSION>.>.
    5. Cannot retrieve <Version> from the repository. Additional database information: <SQL submitted to ODBC data source
    <Prova7> resulted in error <[MySQL][ODBC 3.51 Driver][mysqld-5.0.77]No database selected>. The SQL submitted is <select
    VERSION from AL_VERSION>.>.
    I checked that the MySQL DB has those tables (they were created when I set up DS), but I don't know why this error says 'No database selected'; the odbc file looks correct...
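    For what it's worth, the MySQL ODBC driver reports 'No database selected' when the DSN does not name a default database, so it is worth checking that the <Prova7> entry in $LINK_DIR/bin/odbc.ini has a Database key. A sketch of such an entry, with placeholder values for the driver path, host, and database name:

        [Prova7]
        Driver   = /usr/lib/libmyodbc3.so
        Server   = localhost
        Port     = 3306
        Database = ds_repo

    If the Database line is missing or points at the wrong schema, the repository queries against AL_VERSION will fail exactly as in the log above.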
    Thank you,

  • Steps:- Import SWCV from SLD into  Enterprise service Builder

    Hi All,
    I have created new SWCV in SLD.
    What are the steps to import a SWCV from the SLD into the Enterprise Services Builder?
    Thanks,
    Anshul

    Abhishek,
    I am using PI 7.1, and there is no option showing Tools --> Transfer from SLD.
    It has only the Import Design Objects option.
    Please advise.
    Thanks,
    Anshul

  • Error starting batch job through Data Services Web Service

    I have a customer running our web service interface to BODS. We have not been able to successfully start any jobs due to the following error:
    Services sent the request to start batch job 'Job_Test_Web_Services' on server 'null'. The Job Server refused the request with error: Job Server not found
    The same interface works against other environments (all XI 3.1), so I need some pointers as to where it gets the server name. Sounds to me like a configuration issue, but I am not a BO DS admin and don't know where to look.
    Please help.
    Thanks.

    I was able to reproduce this issue in our own lab by going to the BO DS Server Manager and removing the servers from the configuration. Not sure if this is what is happening at this customer site, but I will check it out and post back.
    Not sure why they would not have a Job Server defined if they want to run jobs!

  • Regarding rescheduling jobs in data services management console

    Hi friends
    There are some issues which are leading to job failure. Is it possible to rerun the job once again? That is, can we rerun the job in production for a brief investigation? If we do, what will be the consequences?
    1)     Will it duplicate the data?
    2)     Or is there any possibility of data loss...
    Please give suggestions, and the consequences if we do.
    Thanks in advance

    I would not suggest doing anything to your production data. Make the change in the job as given below. Add a Map_Operation transform just before the point where you write to the target table. Inside the Map_Operation transform, set everything to Discard; make sure you have selected Discard for every operation code. Then, in order to see what data your job is trying to write to the target table, create a template table and join it to the transform just before the Map_Operation. Suppose you have a Query transform at the end, right before the target table: connect that Query transform to the Map_Operation transform as well as to the template table.
    You can now see what data your job would write to the target without writing anything to the target.
    Test this in dev first before trying it in Production.
    REMEMBER: PRODUCTION IS A SACRED SPACE AND WE DON'T WRITE UNNECESSARY DATA OR DO ANY MANIPULATION IN PROD TABLES.
    Thanks,

  • Issue while using views in Data services Jobs

    Hi,
    In a Data Services job, I am trying to pull data from a view into a table. The view points to a table in another database.
    The problem is that when I import the source view into Data Services and view the data, I find one row with wrong data. The values in that row are wrong/corrupted, while the same row in the source table has the correct values.
    I queried the view from TOAD for that record. The values are valid.
    The data comes out wrong only in Data Services. Any row in the table can get corrupted; there is no specific row.
    Hence I am getting errors while running the jobs.
    Any idea what could be the reason for getting corrupted data in the view, while the same view queried from TOAD gives the correct values?

    Hi,
    There is a possibility of a data type unsupported by Data Services. Please share the Data Services version, the database type, and the data type of the column that got corrupted.
    Regards,
    M Ramesh

  • HANA view as source in Data Services

    Hi Gurus,
    I have a data flow in Data Services where an attribute view is used as the source. That data flow is running fine as of now.
    As per an updated requirement, we have added a couple of additional fields to the view and have to use them in our data flow.
    But when I try to re-import the view from HANA, it does not allow me to do so. It throws an error that the table does not exist.
    However, if I delete it from my repository and import it as a new one, it works fine.
    Could you please let me know if there is a specific way to re-import views from HANA into Data Services.
    Thanks,
    Raviteja.

    SAP does not officially provide an SAP R/3 connector for Data Federator 3.x.
    There is a prototype, though, that you can download and use. Keep in mind that this is not an official SAP product yet.
    SAP BusinessObjects Web Intelligence Reporting for SAP ERP [original link is broken]
    Regards,
    Stratos

  • Data Services IDOC Errors when working with a new Basic Type

    Hello,
    Just wondering if anyone else has encountered an issue similar to the one I am dealing with. We are trying to create business partners in SAP CRM from Data Services via IDoc. We are not using the standard IDoc; instead we are using a new IDoc type.
    This IDoc type is generated by SAP CRM automatically through transaction BDFG. This transaction creates a new basic type for a standard IDoc with the additional fields that we added through the Easy Enhancement Workbench (EEWB).
    The issue we are having is that when we import the IDoc type into Data Services and look at the schema, many fields and sub-structures of the IDoc are duplicated. This means that we receive hundreds of errors when we try to validate our data flow. I can delete the duplicated sub-structures in my Query transformation, but then the Query transformation's output schema and the IDoc input schema are no longer identical.
    I have searched all over OSS and various web forums for a solution, but it seems no one else has even encountered this error. If anyone has had a similar issue and can offer help, it would be greatly appreciated.
    Bill

    The issue is resolved.
    After importing the metadata for the IDoc into Data Services, I edited out all of the duplicate structures and fields from the IDoc.
    After making the IDoc structure in Data Services identical to the structure of the IDoc in SAP, I tested the IDoc and it executed successfully.

  • Data Services 12.2.3.0 BODI-1112015 Adapter metadata import failed

    Hi Experts,
    I am using Data Services 12.2.3.0.
    I have an issue importing functions through an 'Adapter' type datastore into Data Services. I can open the datastore and see the list of functions available, but when I try to import them, I get the error BODI-1112015 Adapter metadata import failed.
    The setup and the errors are as below.
    The adapter datastore is setup as below.
    I built a new keystore called clientkeystore.jks in ..\bin. Then I created the .CSR file and imported the signed chained certificate (I believe it's a chained certificate) of the server hosting the wsdl into the keystore.
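    (For reference, the keystore build described above roughly corresponds to these JDK keytool steps; the alias and file names here are hypothetical:

        keytool -genkeypair -alias dsclient -keyalg RSA -keystore clientkeystore.jks
        keytool -certreq -alias dsclient -file dsclient.csr -keystore clientkeystore.jks
        keytool -importcert -trustcacerts -alias wsdlserver -file signed_chain.cer -keystore clientkeystore.jks

    The first command creates the key pair, the second produces the .CSR to be signed, and the third imports the signed chain into the keystore.)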
    Thanks for the post http://scn.sap.com/thread/1589052 . After changing the metadata character set to utf-8, I can see a list of functions when I open this New_Datastore in Data Services. This proves that the datastore setup has no problem parsing the wsdl file and giving me the list of functions in it.
    However, the error appears when I try to import them.
    Error is:
    Adapter metadata import failed. Error message: (BODI-1112015) Error parsing the <TheFunctionToBeImported> included in the XML sent by the adapter to represet a function <Error importing XML Schema from file <adapter_schema_in.xsd>:<XML parser failed: Error <Schema Representation Constraint: Namespace 'http://result.form.v81.api.keysurvey.com' is referenced without <import> declaration> at line <13>, char <46> in < < xsd:schema xmln:xsd=http://www.w3.org/2001/XMLSchema" xmln:tns="http://result.form.v81.api.keystore.com" xmlns:diws="http://businessobjects.com/diwebservice" targetnamespace="http://www.businessobjects.com/diwebservice"><xsd:import namespace='http://v81.api.keysurvey.com' schemaLocation='C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice\FormResultManagemenetgetRespondentsgetRespondents0.xsd'/>
    <xsd: import namespace='http://result.form.v81.api.keysurvey.com' schemaLocation='C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice\FormResultManagemenetgetRespondentsgetRespondents2.xsd'/> ........
    When comparing this with the wsdl file (as below), it is worth noting that the schemaLocation has been changed to a local directory under C:\Program Files\Business Objects\BusinessObjects Data Services\ext\webservice, while that was not the case in the wsdl, where the schemaLocation is on the server.
    I am wondering if this redirection from the server specified in the wsdl file to the local directory has caused the error. The message 'namespace is referenced without <import> declaration' looks wrong on its face, as the <import> is right there.
    Or there may be some other reason behind this.
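    For contrast, a schema may only use a namespace prefix if it both declares the prefix and imports the namespace; a minimal well-formed sketch of what the generated adapter_schema_in.xsd should contain (names shortened, schemaLocation a placeholder):

        <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                    xmlns:res="http://result.form.v81.api.keysurvey.com"
                    targetNamespace="http://www.businessobjects.com/diwebservice">
          <xsd:import namespace="http://result.form.v81.api.keysurvey.com"
                      schemaLocation="FormResults.xsd"/>
          <!-- elements may now be referenced through the res: prefix -->
        </xsd:schema>

    Note that in the error dump the declaration reads xmln:tns (a mangled xmlns:tns) and points at keystore.com rather than keysurvey.com, so the schema generated by the adapter may genuinely be malformed rather than merely redirected.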
    I appreciate any advice or questions from you!

    I have reached the exact same error as this post http://scn.sap.com/thread/3190403
    The error is
    [Mon Jun 18 23:14:28 2012] [error] ..\..\src\core\deployment\conf_builder.c(876) Specifyingservices and modules directories using axis2.xml but path of the library directory is not present
    [Mon Jun 18 23:14:28 2012] [error] ..\..\src\core\deployment\conf_builder.c(261) Processing transport senders failed, unable to continue
    [Mon Jun 18 23:14:28 2012] [error] ..\..\src\core\deployment\dep_engine.c(939) Populating Axis2 Configuration failed
    [Mon Jun 18 23:14:28 2012] [error] ..\..\src\core\deployment\conf_init.c(195) Loading deployment engine failed for client repository C:\Program Files (x86)\SAP BusinessObjects\Data Services\ext\webservice-c\axis2.xml
    As this has been identified as a version problem, the issue is not going to be investigated any further.
    As an alternative, one can try to use the Oracle 11g SOAP_API.sql.

  • Change source path in batch Job in global variable in data services

    Hi Experts,
    My organization created a job in Data Services 3.2 to cleanse data read from Excel flat files. The folder path was stored in a global variable (I think), and now the directories have changed, hence it is throwing me the error below.
    Error, Input file  does not exist please confirm existence and restart job, 16 ) >
    failed, due to error <50316>: <>>> Error, Input file  does not exist please confirm existence and restart job>. I want to update the folder path. I am sure it is easy, but I am very new to BODS.
    (12.2) 07-15-14 16:10:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Sleeping for 35.000000 seconds...  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> Waking up......  '
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : 'Starting the timer loop number 6...'
    (12.2) 07-15-14 16:10:43 (14232:12656) WORKFLOW: Work flow <WF_Metadata_Files> is started.
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> $G_FILENAME_IN : ALL_Metadata_SALES.xls...'
    (12.2) 07-15-14 16:10:43 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>> looking for input file name
                                                     \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls'
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB DEBUG' : '>>>  Input file Name is '
    (12.2) 07-15-14 16:11:08 (14232:12656)  PRINTFN: > 'JOB ERROR' : '>>> Error, Input file  does not exist please confirm existence and restart job'
    I want to update the folder path \\infra\finance\production\sales\Metadata\ALL_Metadata_SALES.xls to \\Home\BIData\finance\production\sales\Metadata\ALL_Metadata_SALES.xls.
    When I investigated WF_Metadata_Files, I saw there is a global variable called INPUT_DIR; I assume I have to change the path there. I tried to find the old directory in the batch job but I can't find it, and even when I give a value to the global variable, it still points to the old path.
    Can anybody please help me.
    Thanks
    Tim

    Hi Tim,
    If, having specified the value in the global variable, it is still pointing to the old path, a couple of scenarios could apply:
    1. A different global variable is being used for the file path.
    2. The file path is hardcoded in the file format or Excel file definition, despite the declaration of the global variable.
    Are you getting this error when running a dataflow within this workflow, or in a script? It would be better to run the workflow in debug mode and step through the stages to find out where exactly it fails; once you have confirmed INPUT_DIR is the variable in use, see the sketch below for setting it.
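    As a minimal sketch, assuming INPUT_DIR really is the variable the file format reads, you can set it in a script object at the start of the job (Data Services script syntax; the print is just for verification):

        # Script step at the start of the job
        $INPUT_DIR = '\\Home\BIData\finance\production\sales\Metadata';
        print('Input directory set to: ' || $INPUT_DIR);

    If the error persists after this, the path is most likely hardcoded in the Excel file format definition itself.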
    kind regards
    Raghu

  • Data Services job server crashed and won't start backup

    Hello,
    I was running some jobs on Data Services 4.2 SP3 on Windows Server 2012 R2, and they all failed and the job server went down. None of the failed jobs had a trace file or error log in the Management Console. Now I am unable to open Data Services Designer or Data Services Server Manager; when I try to open them, nothing happens. Also, the SAP Data Services job service cannot be started. The job server had been running fine for a few weeks before this. This has happened twice already today; the first time, the only way I was able to fix it was to run a repair on the Data Services install. Can someone please help me understand what is causing this and how it can be fixed?

    Hi Tyler,
    It is a Windows-specific issue; please refer to the link and KBAs below.
    How To Fix Windows Service Error 1053
    http://windows-exe-errors.com/how-to-fix-windows-service-error-1053/
    1986247 - Error "Windows could not start the BusinessObjects Data Services service on local computer" occurs in Data Services 4.1
    https://service.sap.com/sap/support/notes/1986247
    1992260 - Error: Windows could not start the SAP Data Services service on local computer, after upgrading SAP data services and deleting job servers SAP Data Services 4.2
    https://service.sap.com/sap/support/notes/1992260
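    Before applying the KBA fixes, it can help to confirm the service's state from a command prompt; a sketch (the service display name varies by version, so use the first command to find yours rather than trusting the name in the second):

        sc query state= all | findstr /i "Data Services"
        net start "SAP Data Services"

    If the start attempt fails with error 1053, the first link above covers the usual timeout causes.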
    Hope this will help!!!!
    Thanks,
    Daya
