Data Services as a data migration tool

Hello All,
Has anybody used Data Services as a data migration tool from an SAP ECC system? If so, what is it like to use, and are there any documents on this?
Cheers,
Nick.

Hi Nick.
Regarding documents, here are a few; the one I think could be most useful for your understanding is the "BI109 SAP Data Migration" session: http://www.sdn.sap.com/irj/scn/shop?rid=/media/uuid/c08b931e-2a83-2c10-2aba-cb3968c5bc4e.
It shows the data migration framework based on SAP BusinessObjects technologies for legacy to SAP and SAP to SAP migrations.
More technical documents, showing how to set up IDocs, LSMW, etc.:
http://help.sap.com/bp_dmg603v1/DMS_US/Documentation/DM_installation_guide_EN_US.doc
http://help.sap.com/bp_dmg603/DMS_US/Documentation/DM_Quick_Guide_EN_US.doc
Regards,
July

Similar Messages

  • Difference Between Data Services Designer and Data Services Workbench

    Hello All,
    I am new to Data Services.
    What is the difference between Data Services Designer and Data Services Workbench?
    I am a bit confused between the two.
    Please help me to understand the same.
    Thanks in advance.
    Aisurya

    Workbench is used to create, display and modify objects. It displays the source table data, and you can see the logs and status of the jobs you have executed. In BODS 4.2 you can design dataflows in Workbench (earlier releases did not have that option), but Designer offers a debugging option, scripting, and support for all databases; those options are not available in Workbench. For more information refer to these documents:
    https://decisionfirst.files.wordpress.com/2014/07/data-services-workbench-intro.pdf
    http://scn.sap.com/community/data-services/blog/2014/03/01/data-services-42-workbench
    http://scn.sap.com/community/data-services/blog/2013/01/24/data-services-workbench-part-1

  • Migration from Data Services 3.2 - Data Services 4.0 document

    Hi Experts,
      Can somebody provide me with a document or link where I can find the migration or process flow from Data Services 3.2 to Data Services 4.0?
    Any help in advance is appreciated
    Thanks
    AJ

    Hi,
    check the DS Upgrade Guide:
    https://websmp109.sap-ag.de/~sapidb/011000358700001323242010E/sbo401_ds_upgrade_en.pdf
    You need an S-User with a password to log in.
    Regards
    -Seb.

  • ODSI csv file creation from a data service that read data from many tables.

    Hi,
    We have a few data services running in our environment. Now we have a requirement to save the data as CSV files on demand. My question: can we do this using ODSI 10gR3?
    I know we can read a CSV file and create a physical data service. We have completed a use case where data comes in a CSV file and is saved to an Oracle table when required.
    Now, how to do the reverse is my question. I am not finding any help in the documents for this. I have opened a case with Oracle support for this, but it looks like they are not understanding it and are treating it as an issue in ODSI 10gR3.
    Please help.
    Thanks in advance.
    Rahul

    "we have a requirement to save the data as CSV files on demand. My question, can we do this using ODSI 10gR3?" Yes, but you have to write the Java code to do it. Create a class with a static method that takes two arguments - an array of XmlObject (or of the specific type that extends XmlObject created by compiling the schema) and a filename - and writes the values of the children of the array elements into your CSV file.
    Put that class in a jar and put the jar in DSP-INF/lib, then create a physical data source -> Java from that method, and make sure you specify that the resulting data source procedure is a libraryProcedure.
    So you'll have to do a little homework on ODSI Java functions with XmlObject arguments.
    - Mike
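    A minimal sketch of what Mike describes, assuming the rows arrive as generic XmlObject instances (Apache XMLBeans) and ignoring CSV quoting/escaping; the package, class and method names below are only illustrative, not part of ODSI itself:

    package example.odsi;

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;
    import org.apache.xmlbeans.XmlCursor;
    import org.apache.xmlbeans.XmlObject;

    public class CsvFileWriter
    {
         // Candidate library procedure: first argument is the array of rows returned by
         // the data service, second is the target CSV file name.
         public static void writeCsv(XmlObject[] rows, String fileName) throws IOException
         {
              PrintWriter out = new PrintWriter(new FileWriter(fileName));
              try
              {
                   for (XmlObject row : rows)
                   {
                        StringBuilder line = new StringBuilder();
                        XmlCursor cursor = row.newCursor();
                        try
                        {
                             // walk the direct child elements of the row and collect their text values
                             if (cursor.toFirstChild())
                             {
                                  do
                                  {
                                       if (line.length() > 0)
                                            line.append(',');
                                       line.append(cursor.getTextValue());
                                  }
                                  while (cursor.toNextSibling());
                             }
                        }
                        finally
                        {
                             cursor.dispose();
                        }
                        out.println(line.toString());
                   }
              }
              finally
              {
                   out.close();
              }
         }
    }

    Packaged in a jar under DSP-INF/lib, a static method like this is what you would then register as the Java library procedure Mike mentions.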

  • Connecting to ADO Data Services using XML Data

    Has anyone attempted to connect to a database using ADO Data Services? My understanding of ADO Data Services is that it allows querying of the database in a REST format. So if I publish a site, I can get to its data with a URL like this:
    http://mysite/myservice.svc/entity where entity is a defined entity like Product, or supplier. Going to this site returns data that I have included at the end of this post, with the Entry node repeating (I only included the first Entry node). I definitely cannot use Web Service as my connection method, because there is no WSDL, but I was thinking maybe I could use XML. However, I have read the help file and several forum posts and I can't seem to figure out how to make XML work.
    <?xml version="1.0" encoding="utf-8" standalone="yes" ?>
      <feed xml:base="http://mysite/FpWebDataService.svc/" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns="http://www.w3.org/2005/Atom">
      <title type="text">InfrastructureTickets</title>
      <id>http://mysiteFpWebDataService.svc/InfrastructureTickets</id>
      <updated>2008-12-22T14:44:03Z</updated>
      <link rel="self" title="InfrastructureTickets" href="InfrastructureTickets" />
      <entry>
      <id>http://mysite/FpWebDataService.svc/InfrastructureTickets(1)</id>
      <title type="text" />
      <updated>2008-12-22T14:44:03Z</updated>
      <author>
      <name />
      </author>
      <link rel="edit" title="InfrastructureTickets" href="InfrastructureTickets(1)" />
      <category term="FootprintsModel.InfrastructureTickets" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" />
      <content type="application/xml">
      <m:properties>
      <d:TicketID m:type="Edm.Int32">1</d:TicketID>
      <d:UserLastName>Phillips</d:UserLastName>
      <d:UserFirstName>Michael</d:UserFirstName>
      <d:TicketTitle>FAStT 700 updates</d:TicketTitle>
      <d:TicketAssigned xml:space="preserve">miphillips</d:TicketAssigned>
      <d:DateSubmitted>Jul 18 2005 9:55AM</d:DateSubmitted>
      <d:ChangeRequestType>BIOS__fFirmware__bUpdate</d:ChangeRequestType>
      <d:ChangeImpact>Major</d:ChangeImpact>
      <d:ChangeRisk m:null="true" />
      <d:OutageRootCause m:null="true" />
      <d:SolutionType m:null="true" />
      <d:TicketType m:null="true" />
      <d:ChangeReason>Maintenance</d:ChangeReason>
      <d:OutageScope m:null="true" />
      <d:OutageManagerDescription m:null="true" />
      <d:ProblemCategory m:null="true" />
      <d:SubmissionType m:null="true" />
      <d:TicketPriority m:type="Edm.Int32">3</d:TicketPriority>
      <d:TicketStatus>_DELETED_</d:TicketStatus>
      <d:UserID>MiPhillips</d:UserID>
      <d:SubmitDate m:type="Edm.DateTime">2005-07-18T09:55:51</d:SubmitDate>
      <d:UpdateDate m:type="Edm.DateTime">2005-07-19T09:26:41</d:UpdateDate>
      <d:TicketAge m:type="Edm.Int32">1253</d:TicketAge>
      <d:TicketUpdateAge m:type="Edm.Int32">1252</d:TicketUpdateAge>
      <d:DescriptionCount m:type="Edm.Int32">2</d:DescriptionCount>
      </m:properties>
      </content>
      </entry>

    Hi lilutchay,
    If the SQL Server Integration Services installation is the Enterprise or Developer edition of SQL Server 2008, we can use Microsoft Connector Version 1.2 for Oracle to connect to Oracle databases with version Oracle 9.2.0.4 or higher, Oracle 10.x or Oracle 11.x. If it is the Enterprise or Developer edition of SQL Server 2012, we can use Microsoft Connector Version 2.0 for Oracle to connect to Oracle databases with version Oracle 9.2.0.4 or higher, Oracle 10.x or Oracle 11.x.
    If we want to connect to other Oracle database versions, we can use the OLE DB provider for Oracle, which has been tested in the SSIS labs and is recommended by Microsoft.
    For the detailed steps for the Microsoft Connector for Oracle, we can refer to the first blog suggested by Deric. For more details about connecting to Oracle using the Oracle Provider for OLE DB from SSIS, please refer to the blog below:
    http://www.databasefun.com/archives/591
    Reference:
    SSIS with Oracle Connectors
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • FB4, Data Services&PHP returned data is lost

    Hi,
    I set up an RPC call using the data services; all works fine so far. FB built a data type for me holding all the returned variables. Then I let FB generate the communication forms for me. This results in a form where I put in the parameters and hit the button that triggers the call, and a second form that is supposed to show the results. While the RPC call itself works fine, the results are never shown. When I dump, in PHP, what is supposed to be returned, everything looks fine.
    As far as I understand, FB sets up an RPCManager that is supposed to handle the result, but I do not understand how to trace the callback. (I used to use HTTPService, where I set up a callback event manually; that was easy to trace, but I am kind of stuck with the RPCManager.)
    Is there a possibility to track the callback function that is supposed to handle the return values of the RPC call? PHP returns an associative array, and when I configure the data service in FB all fields are correctly identified. What could be a reason for the result form not showing anything? Using RPCs that return only a single string works fine!
    THX in advance.

    Just to add some infos:
    Configuring the service works fine: Fb shows that an object consisting of five strings is returned and this is exactly what happens.
    But when using this service the lastResult value is always NULL. That seems strange to me.
    The old AMFPHP used to have a service browser; is there anything comparable for the Zend Framework? I'd love to find out where the data is lost, or rather, what the difference is between the calls made when configuring the service and the calls coming from the application...

  • Due date calculation from GR Date / Service entry sheet date

    Dear Friends,
    GRN Date--01.05.2009
    Payment terms--- Within 45 days due net
    Transaction code---FBL1N
    The system should not output these records in FBL1N before 14.06.2009.
    It should output them on or after 14.06.2009.
    Regards
    R Panda
    Edited by: Rutabhadra Panda on Jun 2, 2009 9:18 AM

    Hi
    Try with the payment terms option:
    SPRO -> SAP IMG -> Materials Management -> Purchasing -> Vendor Master -> Define Terms of Payment
    regards,
    Raman

  • What is Data services in Business Objects

    Hi Folks,
    Can I know what Data Services is, and
    where we would consider the usage of or need for this tool?

    Hi,
    Data Services consists of Data Quality and Data Integrator, which help in gathering, cleansing and loading data from multiple data sources into a single data warehouse. Reporting and analysis for BI purposes can then be done on the final data warehouse.
    Please find Data Services product guides and documents at [http://help.sap.com/content/bobj/bobj/index.htm]
    Regards,
    Subhodeep

  • R/3 data flow is timing out in Data Services

    I have created an R/3 data flow to pull some AP data in from SAP into Data Services.  This data flow outputs to a query object to select columns and then outputs to a table in the repository.  However the connection to SAP is not working correctly.  When I try to process the data flow it just idles for an hour until the SAP timeout throws an error.  Here is the error:
    R/3 CallReceive error <Function Z_AW_RFC_ABAP_INSTALL_AND_RUN: connection closed without message (CM_NO_DATA_RECEIVED)
    I have tested authorizations by adding SAP_ALL to the service account I'm using and the problem persists.
    Also, the transports have all been loaded correctly.
    My thought is that it is related to the setting that controls the method of generating and executing the ABAP code for the data flow, but I can't find any good documentation that describes this, and my trial and error method so far has not produced results.
    Any help is greatly appreciated.
    Thanks,
    Matt

    You can't find any good documentation??? I am working my butt off just.......just kiddin'
    I'd suggest we divide the question into two parts:
    My dataflow takes a very long time, how can I prevent the timeout after an hour? Answer:
    Edit the datastore; there is a flag called "execute in background" to be enabled. With that, the ABAP is submitted as a background spool job and hence does not have the dialog-mode timeout. Another advantage is that you can watch it running by browsing the spool jobs from the SAP GUI.
    The other question seems to be, why does it take that long even? Answer:
    Either the ABAP takes that long because of the data volume.
    Or the ABAP is not performing well, e.g. join via ABAP loops with the wrong table as inner.
    Another typical reason is to use direct_download as transfer method. This is fine for testing but it takes a very long time to download data via the GUI_DOWNLOAD ABAP function. And the download time would be part of the ABAP execution.
    So my first set of questions would be
    a) How complex is the dataflow, is it just source - query - data_transfer or are there joins, lookups etc?
    b) What is the volume of the table(s)?
    c) What is your transfer method?
    d) Have you had a look at the generated abap? (in the R/3 dataflow open the menu Validation -> Generate ABAP)
    btw, some docs: https://wiki.sdn.sap.com:443/wiki/display/BOBJ/ConnectingtoSAP

  • Unable to use extractor 2LIS_11_VAHDR with Data Services

    Hi,
    We are currently trying to use Data Services to extract data from ECC to HANA using standard Business Content extractors. BW is beyond the scope of this project and we want to build a custom SQL data mart using Data Services and HANA.
    We would like to use the extractor 2LIS_11_VAHDR to extract Sales Order Header information. However, when we try to fill the setup tables for application 11, we see the error 'No extraction structure active or no BW connected'.
    We have confirmed that the data sources are active in both t-codes RSA6 and LBWE. We are also able to pull the DataSource up in t-code RSA3 but obviously no records are returned because the setup tables are empty.
    [Screenshots of RSA6, LBWE and RSA3 not included]
    Please advise as to any other steps needed to allow the filling of the setup tables. Your help is highly appreciated.
    Thanks,
    Abhi

    We were able to resolve this issue with SAP's help. Here is the solution.
    If you want to be able to execute OLI7BW, you need to either connect a BW system to the source system or create at least one subscription within Operational Delta Queue.
    If you do not want to connect a BW system, this means you would first have to issue a CDC request from Data Services for a 2LIS_11* data source (e.g. 2LIS_11_VAITM) with an empty setup table.
    This will create an active subscription, enabling you to start the setup program OLI7BW.
    After this you should be able to request a second initial load from Data Services, by temporarily setting property "Initial load" to "Yes". This will be the "real" delta init, providing the data from the setup tables to Data Services.
    Hope it helps!

  • ADF BC and the Active Data Service

    hi
    The OFM Fusion Developer's Guide for Oracle ADF 11g Release 1 (B31974-05) has a section "42 Using the Active Data Service"
    at http://download.oracle.com/docs/cd/E15523_01/web.1111/b31974/adv_ads.htm
    that says "... If you want your components to update based on events passed into ADF Business Components, then you need to use the Active Data Proxy. ..."
    but it does not seem to explain how to use ADF BC and the Active Data Service.
    I have been able to create this example application ...
    http://www.consideringred.com/files/oracle/2010/ActiveDataServiceADFBCApp-v0.01.zip
    ... that does not have an af:poll component (but has moved polling into a managed bean).
      <managed-bean>
        <managed-bean-name>sumSalBean</managed-bean-name>
        <managed-bean-class>activedataserviceadfbcapp.view.SumSalBean</managed-bean-class>
        <managed-bean-scope>session</managed-bean-scope>
        <managed-property>
          <property-name>empSumSalVO</property-name>
          <value>#{data.activedataserviceadfbcapp_view_sumSalPagePageDef.SumSal.viewObject}</value>
        </managed-property>
      </managed-bean>
    This is some code in the SumSalBean class:
    package activedataserviceadfbcapp.view;
    // also based on code found in "ADF's Active Data Service and scalar data (like activeOutputText)" by Matthias Wessendorf
    // at http://matthiaswessendorf.wordpress.com/2010/01/07/adf%E2%80%99s-active-data-service-and-scalar-data-like-activeoutputtext/
    // (imports of java.util.concurrent, javax.annotation.PostConstruct and the ADF Active Data Service classes are omitted in this excerpt)
    public class SumSalBean
         extends BaseActiveDataModel
    {
         protected static final String SUM_SAL_NAME = "sumSal";
         protected final AtomicInteger fCurrentChangeCount = new AtomicInteger(0);
         protected long fActiveDataUpdateEventTime;
         protected EmpSumSalVO fEmpSumSalVO = null;

         @PostConstruct
         public void setupActiveData()
         {
              ActiveModelContext vActiveModelContext =
                   ActiveModelContext.getActiveModelContext();
              Object[] vKeyPath = new String[0];
              vActiveModelContext.addActiveModelInfo(this, vKeyPath, SUM_SAL_NAME);
              ScheduledExecutorService vSEService = Executors.newScheduledThreadPool(1);
              vSEService.scheduleAtFixedRate(new Runnable()
                   {
                        public void run()
                        {
                             if (hasDataChanged())
                                  triggerActiveDataUpdateEvent();
                        }
                   },
                   3, // let's wait some seconds
                   2, // period between the updates
                   TimeUnit.SECONDS);
         }

         public void triggerActiveDataUpdateEvent()
         {
              setActiveDataUpdateEventTime(System.currentTimeMillis());
              incrementCurrentChangeCount();
              ActiveDataUpdateEvent vEvent =
                   ActiveDataEventUtil.buildActiveDataUpdateEvent(
                        ActiveDataEntry.ChangeType.UPDATE,
                        getCurrentChangeCount(), new String[0], null,
                        new String[] { SUM_SAL_NAME },
                        new Object[] { getSumSal() });
              fireActiveDataUpdate(vEvent);
         }

         public String getSumSal()
         {
              EmpSumSalVO vEmpSumSalVO = getEmpSumSalVO();
              return "" + vEmpSumSalVO.getFirstSumSal();
         }

         protected void startActiveData(Collection<Object> rowKeys,
              int startChangeCount)
         {
              // no-op in this example
         }

         protected void stopActiveData(Collection<Object> rowKeys)
         {
              // no-op in this example
         }

         public int getCurrentChangeCount()
         {
              return fCurrentChangeCount.get();
         }

         protected boolean hasDataChanged()
         {
              EmpSumSalVO vEmpSumSalVO = getEmpSumSalVO();
              return vEmpSumSalVO.hasDataChanged(getActiveDataUpdateEventTime());
         }

         public void setEmpSumSalVO(EmpSumSalVO pEmpSumSalVO)
         {
              fEmpSumSalVO = pEmpSumSalVO;
         }

         // (the simple helpers getEmpSumSalVO(), getActiveDataUpdateEventTime(),
         //  setActiveDataUpdateEventTime() and incrementCurrentChangeCount() are not shown in this excerpt)
    }
    How all this behaves at runtime can be seen in this screencast
    at http://www.screentoaster.com/watch/stUEpQSkxIR19aSV9YW1NRVF9W/activedataserviceadfbcapp_v0_01_zip_demo
    I would welcome comments on how the example application in ActiveDataServiceADFBCApp-v0.01.zip can be improved, or references to information on how this should be done properly.
    question
    (q1) Where can I find some example code that does use ADF BC and the Active Data Service?
    many thanks
    Jan Vervecken

    Jan,
    ADF BC does not natively support ADS yet. It's planned for a next release. The only Data Control that supports ADS out of the box is BAM. To use ADF BC with e.g. database change notifications you
    - create a shared AM
    - Configure the VO to respond to database changes (check box)
    - Configure the database to broadcast changes
    - Use an af:poll component for the refresh because the update would be on the model layer only (a minimal pollListener sketch follows at the end of this reply)
    So what is in the documentation is a doc bug. In the current release you can use ADS best with a POJO model (that you use directly for dashboard use cases). You can though use a POJO data control, but at the current stage this would just act as a pass-through for the data access.
    See example 156 on http://blogs.oracle.com/smuenchadf/examples/ for how to do it with ADF BC
    Frank
    Ps.: Of course, the plan is to make everything working out of the box with no developer action required.
    Edited by: Frank Nimphius on Feb 12, 2010 6:56 AM
    Re-read your post. Maybe I need to revise my comment. Are you accessing the AM directly or via the ADF layer? If the latter - I did not yet look at your sample - then this may work if you don't release the AM module you access directly (it may not scale well).
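    As a rough, assumption-laden sketch of the af:poll approach from the list above (not taken from example 156): a pollListener in a backing bean can simply re-execute the iterator binding on every poll tick, and the components showing the data refresh via partialTriggers on the poll component. The "Execute" action binding is assumed to have been added to the page definition:

    import oracle.adf.model.BindingContext;
    import oracle.adf.view.rich.event.PollEvent;
    import oracle.binding.BindingContainer;
    import oracle.binding.OperationBinding;

    public class DashboardPollBean
    {
         // wired to something like <af:poll id="adsPoll" interval="2000" pollListener="#{dashboardPollBean.onPoll}"/>
         public void onPoll(PollEvent pollEvent)
         {
              // re-execute the iterator so the next partial refresh shows the rows
              // changed through the database change notification
              BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
              OperationBinding refresh = bindings.getOperationBinding("Execute");
              refresh.execute();
         }
    }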

  • SAP BO Data Services XI 3.2 - Cannot Handle Multithreaded RFC Connection?

    Hi Guys,
    Just want to ask for your input: is it the case that Data Services cannot handle multiple RFC connection requests to a BW system?
    The scenario is:
    There is one BODI job using an RFC connection, and a second job is triggered at the same time; it so happens that the second job fails.
    The current version of SAP BO Data Services XI 3.2 that we are using is 12.2.2.1.
    Thanks in advance,
    Randell

    Arpan,
    One way to get to the multiprovider data is to use Open Hub with a DTP that gets the data from the multiprovider and exposes it as an open hub destination to Data Services. With Data Services XI 3.2 we now fully support Open Hub where Data Services will (1) start the process chain to load the data (2) read the data when process chain ended and (3) notify Open Hub when done so that the data can be purged again.
    More info on Open Hub here : http://help.sap.com/saphelp_nw04/helpdata/en/1e/c4463c6796e61ce10000000a114084/content.htm
    But I will also look into why we show the multiproviders when browsing the metadata, but get an error when trying to extract using the ABAP method (not via Open Hub). You could be right in your assumptions below and we might just need to hide the multiproviders when browsing metadata.
    Thanks,
    Ben.
    Edited by: Ben Hofmans on Jan 5, 2010 6:06 PM - added link to Open Hub documentation which references multiproviders as possible source.

  • BP data services:Can I config Job server via commandline?

    Hi Experts,
    Can I configure a Job Server via the command line?
    I need to create a new Job Server JS_TEST01 (port 3501), add the repository to this Job Server, and restart the Job Server via the command line, just like I can create a repository via the command RepoManBatch.exe:
    C:\Program Files\Business Objects\BusinessObjects Data Services\bin>RepoManBatch.exe
    Usage:
    -U<User> : Repository login user
    -P<Password> : Repository login password
    -S<Server> : Repository server name
    -N<DatabaseType> : Repository database type
    -Q<Database> : Repository database
    -g : Repository using Windows Authentication (Microsoft SQL Server only)
    -t<Type> : Repository type: local, central, profiler
    -c : Repository create
    -u : Repository upgrade
    -v : Repository version
    -d : Show details
    -a : Repository security
    C:\Program Files\Business Objects\BusinessObjects Data Services\bin>
    Thanks
    Miller

    I have already installed SQL Server Express and Data Services 12.2, and I can create the Job Server [JS_Test01] and add a repository to this Job Server through Data Services Server Manager (C:\Program Files\Business Objects\BusinessObjects Data Services\bin\AWServerConfig.exe). My requirement now is that I would like to accomplish this action via the command line instead,
    just like creating a repository with the DOS command RepoManBatch.exe:
    C:\Program Files\Business Objects\BusinessObjects Data Services\bin>RepoManBatch.exe
    Usage:
    -U<User> : Repository login user
    -P<Password> : Repository login password
    -S<Server> : Repository server name
    -N<DatabaseType> : Repository database type
    -Q<Database> : Repository database
    -g : Repository using Windows Authentication (Microsoft SQL Server only)
    -t<Type> : Repository type: local, central, profiler
    -c : Repository create
    -u : Repository upgrade
    -v : Repository version
    -d : Show details
    -a : Repository security
    C:\Program Files\Business Objects\BusinessObjects Data Services\bin>

  • ColdFusion Remote Service Calls and Dates

    Hi all, I'm hoping someone can help me out with my confusion. I'm sending an object including datetime data from ColdFusion to a Flash application. The datetime values in the ColdFusion object include milliseconds; however, if I use the getTime() function on the Date object in the Flex application, the milliseconds have been zeroed out. Is there any way to retain the milliseconds when the data is passed through the Flex gateway?
    Thanks,
    Christine

    Hi Balaji,
    I have attached a demo of my proxy webservice called pxTest.cfc.  pxTest.cfc contains two functions, testDate and pushBackTestDate.  I also attached a simple Flex App called testDate.mxml.  And a Test file called test.cfm.
    Save pxTest.cfc to your CFC folder.  Create a new project called TestDate.  Go to the data/services tab, click "Connect to Data/Service" and select Data Service ColdFusion.  Browse to PxTest.cfc and select it for the CFC location.  Click Next, then click Finish.  Configure ReturnType as Date for both pushBackTestDate and testDate.  Paste the code from TestDate.mxml into the blank TestDate.mxml file that was created when you created the project TestDate.  Run the TestDate.mxml application.
    On load of the application a date was requested and received from pxTest.cfc.  Click Confirm Test Date and note the date returned, and the milliseconds value for that date.  Next click Push User Data back, this will call the method pushBackTestDate sending the date from the Flex app back into CF.  All pushBackTestDate does is set the value returned into session and return it back to the Flex app again.
    Click the button Show Pushed Data Result to see the date that was returned.  In my tests this date no longer has a value for milliseconds.  As long as pxTest.cfc and test.cfm run in the same session you will be able to see the dates sent and returned by running test.cfm.  The dump of that session variable and time formatted versions of that variables are displayed below:
    RETURNED
    {ts '2009-09-17 12:57:50'}
    SEND
    {ts '2009-09-17 12:57:50'}
    TimeFormat Send 12:57:50 13
    TimeFormat Returned 12:57:50 0
    Note the date sent contains milliseconds, the date returned does not have milliseconds.  Hopefully this demo will work for you and demonstrate the issue.  Please let me know if you need more information.
    Thanks,
    Christine

  • Data Services job server crashed and won't start backup

    Hello,
    I was running some jobs on Data Services 4.2 SP3 on Windows Server 2012 R2, and they all failed and the job server went down. None of the jobs that failed had a trace file or error log in the management console. Now I am unable to open Data Services Designer or Data Services Server Manager; when I try to open them, nothing happens. Also, the SAP Data Services job service cannot be started. The job server had been running fine for a few weeks before this. This has happened twice already today; the first time, the only way I was able to fix it was to run the repair on the Data Services install. Can someone please help me understand what is causing this and how it can be fixed?

    Hi Tyler,
    It was a Windows-specific issue; please refer to the links and KBAs below:
    How To Fix Windows Service Error 1053
    http://windows-exe-errors.com/how-to-fix-windows-service-error-1053/
    1986247 - Error "Windows could not start the BusinessObjects Data Services service on local computer" occurs in Data Services 4.1
    https://service.sap.com/sap/support/notes/1986247
    1992260 - Error: Windows could not start the SAP Data Services service on local computer, after upgrading SAP data services and deleting job servers SAP Data Services 4.2
    https://service.sap.com/sap/support/notes/1992260
    Hope this will help!!!!
    Thanks,
    Daya
