SLT through BODS

Hi All,
Can we replicate data from ECC to HANA via SLT through BODS?
All help is appreciated.
Thanks
Sam

Well, no one can ever accuse me of not admitting my mistakes. 
Data Services 4.2 SP1 can consume CDC data delivered via SLT (although it is via ODP).
An excerpt from the 4.2 SP1 What's New guide - http://help.sap.com/businessobject/product_guides/sbods42/en/ds_42_whats_new_en.pdf
SAP LT Replication Server integration: Data Services has been enhanced to integrate with SAP LT Replication Server (SLT) by leveraging the new version of the ODP API. The existing extractor interface in Data Services has been enhanced and replaced with ODP in the Object Library and metadata browsing. ODP allows uniform access to all contexts provided by the ODP API.
In case the new interface is not available on the SAP system, the old extractor interface (ODP or native) will be used.
A new option has been added to the SAP datastore: "Context". The SAP datastore in Data Services has been enhanced to support SLT objects. Working with these objects is similar to the way Data Services users work with SAP extractors today. The ODP context allows you to connect to both the extractors and SLT.
For more information about ODP in SAP Data Services, see the Supplement for SAP.

Similar Messages

  • Load HANA Data to Netezza through BODS Job

    Hi Experts,
    I am new to BODS and Netezza. I have a requirement loading HANA table data into Netezza table through BODS job.
    We created data stores for HANA and Netezza. And we have created Netezza table same like HANA source table.
    I don't have any query transform logic in the job; it's a direct 1:1 mapping. But when I execute the job, I get an error that says "Cross Database Access not supported for this type of command".
    I don't have any idea about this error.
    Please share your thoughts and idea's if anyone faced similar issues.
    Thanks,
    Best Regards,
    Santhosh.

    Hi Manoj,
    Thanks for your reply.
    Finally, after some adjustments to the Netezza table properties (Bulk Loader Options tab), the job now executes successfully.
    Here are the Netezza table property details:
    Thanks,
    Best Regards,
    Santhosh Kumar.

  • Process unicode and xlsx files through BODS

    Dear Experts,
    Could you please help me with the following scenario:
    System: BODS 3.2 on Linux Server
    Our clients want to send their source data as "xlsx" and "unicode" files created on Windows. Can BODS 3.2, or any higher version on Linux, process these file types?
    Thanks,
    Santosh

    Dear Experts,
    Can anyone help me out with the Unicode issue as well? I found that Linux only processes files in the UTF-8 character set, and since the Unicode file created on Windows is UTF-16, BODS 3.2 on Linux cannot process it. I assume this is a Linux issue and not a BODS one.
    Could someone help with any solution or work-around
    Thanks,
    Santosh
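
    Since the Linux job server expects UTF-8, one common workaround is to transcode the Windows-created UTF-16 file before the job reads it. A minimal sketch in Python (file paths are illustrative placeholders; in practice this could run as a pre-load script):

    ```python
    # Convert a Windows-created UTF-16 file to UTF-8 so a Linux job server
    # can read it. Paths are illustrative placeholders.
    import codecs

    def utf16_to_utf8(src_path, dst_path):
        with codecs.open(src_path, "r", encoding="utf-16") as src, \
             codecs.open(dst_path, "w", encoding="utf-8") as dst:
            for line in src:
                dst.write(line)
    ```

    The same transcoding can be done with the `iconv` command-line tool on most Linux systems.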

  • RDS for SAP data migration through IDOC's

    Dear Experts
      We have just installed BODS 4.1, and RDS is also installed. I am in the process of exploring RDS. I found that all the migration objects are available in the form of IDocs. Now my questions are:
    How do we work with custom IDocs? Do I need to code extensively on the R/3 side to upload the custom fields, e.g. a user exit or BAdI for custom fields?
    If yes, then I would say the implementation time through RDS will remain more or less the same.
    Through BODS or the RDS migration objects, is there any way to migrate custom fields without coding on the ABAP R/3 side?
      Please let me know how to achieve the above requirements with RDS.
    Thanks
    Vijay Mukunthan

    Hi Vijay,
    You are right: the standard content of the Rapid Data Migration (RDS) packages covers the standard IDoc interface for SAP standard objects only, so custom objects are not pre-built with the package. However, we incorporated the foundation to include custom coding as easily as possible:
    1) IDoc
    We are using the standard interface technology IDoc. IDocs have an enhancement concept, which is described in the online help. There are three main cases and all of them are supported; however, two need additional coding:
    Z-fields: additional fields in existing tables lead to IDoc segment extensions.
    With the IDoc segment versioning concept this is easy to realize without any ABAP coding.
    Z-tables: additional IDoc segments will be necessary and lead to a new IDoc version.
    With the IDoc versioning concept you can easily add Z-segments by leveraging the provided user exits.
    Custom business objects:
    You can even create a completely custom IDoc message type in the Z-namespace for your own objects, leveraging the given ALE layer of the IDoc middleware.
    2) Enhancement guide
    With the RDS package we offer an enhancement guide, which helps you modify the jobs and data flows in the SAP Data Services content according to the changes in the target IDoc structure, to reflect additional fields and tables. We built it as a step-by-step guide for a sample IDoc type, but it works similarly for any IDoc.
    Best regards,
    Frank Densborn.

  • SAP BODS Licensing

    Hello Everyone,
    It's a very specific question about SAP BODS licensing.
    The version can be 3.1 / 4.0 / 4.2.
    I tried to find this information from different sources on Google but was not able to get any proper information.
    We are trying to figure out how many licenses we need.
    The plan is :
    Environment / Infrastructure capability:
    Development (20 repositories)
    • 15-16 developer repositories, one per developer
    • One central repository for version control
    • 3-4 backup repositories for any ad-hoc needs
    Test - Component Testing (4 repositories)
    • 2 main repositories for executions
    • 2 backup repositories for any ad-hoc needs
    Test - Integration Testing (6 repositories)
    • 3 main repositories for executions
    • 3 backup repositories for any ad-hoc needs
    Test - Dress Rehearsal (4 repositories)
    • 2 main repositories for executions
    • 2 backup repositories for any ad-hoc needs
    Production Environments (2 repositories)
    • 1 main repository for executions
    • 1 backup repository for any ad-hoc needs
    Thank you

    All the comments above are helpful.
    But I am still stuck in a few places defining the licensing for BODS:
    1. If it depends on CPU cores, how do we determine how many cores will be required?
    2. Is the calculation based on the volume of data that will pass through BODS? If yes, what is the calculation?
    3. Once I derive the number of cores, how do I derive the number of CPU licenses for BODS?
    ***NOTE: Is there a formula to calculate this?***

  • Error in BODS and SAP connectivity

    Hi,
    We are trying to fetch data from SAP through the ETL tool BODS; while doing so we are facing the following issues:
    1. While connecting to SAP RFCs through BODS, we are getting the following error:
    'RFC CallReceive error <Function RFC_ABAP_INSTALL_AND_RUN : You do not have authorization for this function...> (BODI-1112339)'.
    As per our understanding, this requires the privilege to execute the function RFC_ABAP_INSTALL_AND_RUN. Is that correct?
    2. Also, we need to load the BODS functions (related to SAP) onto the SAP server using CTS (Correction and Transport System), as per the help document in the BODS technical manual. We are getting an error while importing the component in BODS because these functions are not available in SAP: "function module Z_AW_TABLE_IMPORT not found".
    We are following the instructions and we have the required files (R900XXX.SXX and K900XXX.SXX), but we do not find the /usr/sap/trans/data and /usr/sap/trans/cofiles directories on the SAP server.
    How do we achieve this in BODS?
    Also, can you tell me whether we can talk to SAP from BODS without importing the tables and metadata into a BODS datastore?
    Urgent help required.
    Thanks in advance
    Jagadesh

    Hi
    Experts, I have the same requirement, but I am doing it with respect to master data.
    There are two options:
    Option I: Your source data should provide an updated date and time (which is the case in my scenario). In the PSA you always load the full data; in the start routine, compare the data with the last update date and time, modify your data in the target accordingly, and then set the date and time to the last updated date and time.
    Option II: In the Data Services load job, use another table called a mirror table, always compare the new data with the mirror table, and send to BW only the changed records (the delta).
    By the way, I have an issue loading hierarchy data to SAP BW. I have the same structure in MSSQL, but when executing the load job it throws the following error:
    Error while executing the following command:  -Sd1 return code:75
    Thanks!
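
    Option II above boils down to a snapshot comparison. Here is an illustrative Python sketch of the idea (the key name "ID" and row layout are assumptions, not DS transform logic):

    ```python
    # Mirror-table delta detection: compare the new full load against the
    # previous snapshot and keep only new or changed rows.
    def compute_delta(new_rows, mirror_rows, key="ID"):
        mirror = {r[key]: r for r in mirror_rows}
        return [row for row in new_rows
                if mirror.get(row[key]) != row]  # new or changed record

    mirror = [{"ID": 1, "NAME": "A"}, {"ID": 2, "NAME": "B"}]
    new = [{"ID": 1, "NAME": "A"}, {"ID": 2, "NAME": "X"}, {"ID": 3, "NAME": "C"}]
    delta = compute_delta(new, mirror)  # row 2 (changed) and row 3 (new)
    ```

    In Data Services the same comparison is typically done with a Table_Comparison transform against the mirror table.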

  • SLT on sandbox Hana server 01- possible?

    Guys,
    Is it possible to test SLT or BODS on Hanasvr-01 (provided as sandbox access by SAP)? If so, can you please let me know how to use it?
    If not, can you point me to some documentation on using SLT/BODS 4.0?
    Thanks

    Hi HANA_FOLLOWER,
    At the moment, neither SLT nor Data Services is configured in the Developer Center HANA systems. We have mentioned it several times, and I repeat it again: we can add replication to our sandbox systems if the community demands it. So reply to this thread if you are interested in trying SLT or DS - if enough users want it, we'll add it. How many are "enough"? Hard to say, but I would hope for at least 5% of active users. As of now, this would mean about 25-30 votes...
    Regarding documentation: the [SLT installation guide|https://websmp209.sap-ag.de/~sapidb/011000358700000604912011] on [http://help.sap.com/hana] contains some information on how to use SLT with HANA in chapter 5. And for data services, just use the normal DS documentation, e.g. the [DS admin guide|http://help.sap.com/businessobject/product_guides/boexir4/en/xi4_ds_admin_en.pdf] - for DS users, HANA is just another ODBC data source.
    hth
    --Juergen
    Edited by: Juergen Schmerder on Dec 19, 2011 3:17 AM

  • BW InfoPackage transport for BODS interfaces

    Hi Experts - I am writing data to SAP BW from an Oracle source through BODS 4.0.
    While executing the BODS job, it creates the DI-generated InfoPackage automatically in the BW system.
    Now we need to transport the BW objects to the BW quality system. What should I do with the DI-generated InfoPackage?
    Either not transport it and create a new one in the quality system, or transport it and change the BODS parameter manually in the BW quality system to work with the BODS quality system.
    Another challenge will be: how do we include them in a process chain?
    Thanks
    R

    Hi Rohan,
    You can tell Data Services which InfoPackage it should use in BW. DS is happy to use any existing InfoPackage, but if none exists (or none with the technical name specified), DS will generate a new one.
    Have a look at the datasource properties in the datastore for your SAP BW target.
    You will find the InfoPackage (technical) name by right clicking the transfer structure in the datastore > properties > Tab Attributes. You can simply enter a value in the InfoPackage_Name field. Make sure to use the technical name.
    So, you can transport the infopackage, and use it in a process chain, no problem.
    Just, when you transport it, double-check whether the technical name in the QA/PROD system is the same.
    If not you'll have to change it in the datasource/transfer structure properties in your BW datastore.
    Jan.

  • Problem with BODS and TERADATA bulkload Utilities

    Hi,
    While loading data into a VARCHAR column using the MLOAD/FLOAD bulk-load utilities, we have observed that NULL values are getting loaded as blanks.
    It seems this issue happens only with CHAR/VARCHAR data. With all other data types, like DATE or INTEGER, NULL values are loaded as NULLs.
    If I were loading data through an MLOAD script directly, I would have used "INDICATOR" to handle this issue.
    But through BODS, I am not sure how I can handle the conversion of NULL values to blanks for CHAR/VARCHAR data types with the MLOAD/FASTLOAD utilities.
    Any input on this issue is much appreciated.
    Regards,
    Sudhakar

    I had a similar problem while migrating in one of my earlier projects. I hope you have tried the following:
    Full rebuild of the index
    Install the Archiver Replication Exception component
    Finally, you can export part by part based on, say, Content Type and import it on your QA.
    regards,
    deepak

  • Template file - EPM.hdbdd

    Hi Experts,
    I am not able to find the demo file EPM.hdbdd in the demo content.
    Can you please guide me and provide the sample code for all the various tables in one file?
    Thanks

    Hi Wenjun,
    Thanks.
    However, I am not able to find the SHINE folder anywhere in HANA Studio. Probably not all the content has been installed here.
    Also, I would like your suggestion on the following: we have recently started one of our HANA implementations, and I need to clarify whether it is recommended to start development using the various suffix notations (i.e. hdbdd, hdbview, hdbsequence etc.) in the Project Explorer in the Development perspective, or whether one can start directly in the system view in the Modeler perspective by creating these objects directly in the respective package.
    Assume my scope is that all the SAP tables would be replicated into HANA via SLT, or that the relevant data sources would create tables in HANA directly through BODS.
    Please advise, keeping lifecycle management, transport management etc. in consideration.
    Your guidance would be highly appreciated.
    Regards
    Kamal

  • Standard Sql scripting Vs CE built-in functions ? Which one should I need to choose?

    Hello,
    My source is CSV files, and I need to load them into HANA after doing some transformations based on multiple validations (for example, based on grade, a new column has to be populated with the relevant hike, i.e. I need if/else logic, case logic, either insert or update into the target table, etc. This validation should be done for each and every row).
    Assume some of the validations are not possible through BODS and I need to implement this logic in HANA scripting (through procedures).
    I know HANA supports two kinds of scripting, i.e. standard SQL statements and CE functions, and I've read in the HANA documentation that CE functions give better performance than standard SQL statements and that we should not mix the two in a single block of code.
    Please let me know which approach I should go for. (I doubt we can do all the functionality using CE functions.)
    I am looking forward to your reply.
    Thanks,
    Sree
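
    For intuition, the per-row if/else derivation described above looks like the following hypothetical Python sketch. Column names (GRADE, HIKE_PCT) and the grade-to-hike mapping are made up for illustration, independent of whether the logic ends up in BODS or in a HANA procedure:

    ```python
    # Row-wise validation/derivation: populate HIKE_PCT from GRADE.
    # Grades and percentages are illustrative assumptions.
    def apply_hike(row):
        grade = row.get("GRADE")
        if grade == "A":
            row["HIKE_PCT"] = 20
        elif grade == "B":
            row["HIKE_PCT"] = 10
        else:
            row["HIKE_PCT"] = 5
        return row

    rows = [{"GRADE": "A"}, {"GRADE": "C"}]
    result = [apply_hike(dict(r)) for r in rows]
    ```

    In SQLScript this would map naturally to a CASE expression applied set-wise rather than a row-by-row loop.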

    Much awaited reply.
    Thanks a lot Jain !
    But just one more point: if some requirements can be implemented through both CE functions and SQL scripting, and some are possible only through SQL scripting, then we need to go with SQL scripting only, as we are not supposed to mix both coding styles together, which results in performance issues. (Even your diagram shows either CE functions or SQL.)
    Am I right?
    Thanks,
    Sree

  • Error in Designer Studio Environment:

    Hi,
          I installed the BODS client 4.0 on Windows XP SP3 32-bit (3 GB RAM) without any error and restarted. When we start the Data Services Designer, it gives the following error:
    Error in Designer Studio Environment:
    System call <LoadLibrary> to load and initialize DLL or shared library functions failed <ds_crypto.dll>. Ensure that the library is installed and located correctly.
    Application will be terminated (BODI-120012)
    Can anyone suggest what the issue could be? What is the solution?


  • Delimiter in the Source Data

    Hi All,
    We are using BODS 3.2 on linux.
    How should we handle it when a delimiter is present in the source data?
    For example:
    We have a CSV file as the source. The file has a comma ',' as part of the data in one of the columns. When we execute the job it throws an error:
    "A row delimiter was seen for row number <1> while processing column number <n> in file"
    How do we handle this (a column delimiter present in the data)?
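
    The usual fix is to have the file producer enclose such fields in a text delimiter (double quotes) and configure that quote character in the flat-file format. A small Python illustration of why quoting works: the parser then treats the embedded comma as data rather than a column separator.

    ```python
    # A comma inside a quoted field is data, not a column separator.
    # This mirrors the "text delimiter" option in a flat-file format.
    import csv, io

    raw = 'ID,NAME,CITY\n1,"Smith, John",Boston\n'
    rows = list(csv.reader(io.StringIO(raw)))
    # rows[1] is ['1', 'Smith, John', 'Boston']: three columns, comma kept
    ```

    If the producer cannot quote the fields, switching to a delimiter character that never appears in the data (e.g. a pipe or tab) is the common fallback.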

    Hello
    I am facing the same issue; actually the client doesn't have any idea about his data.
    How should we handle it when a delimiter is present in the source data (flat file) through BODS?
    We are on SAP BW 7.4 & BODS 4.2.
    Any solution for the same? It's very urgent.
    Thanks in advance

  • Loading into Nested XML target

    Hi all,
    I need to load data from a flat file into an XML file (which has a nested structure).
    How can we achieve this through BODI?

    Yes, you can definitely do this through the NRDM (Nested Relational Data Model) feature in Data Integrator/Data Services.
    The standard documentation contains information on how to do this (check the Designer Guide >> Nested Data chapter), or have a look at our [Data Integrator Tips & Tricks wiki|https://www.sdn.sap.com/irj/scn/wiki?path=/display/bobj/businessobjectsDataIntegratorTipsandTricks], here the section that covers NRDM: https://www.sdn.sap.com/irj/scn/wiki?path=/display/bobj/theNoneRelationalDataModelNRDM
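
    As an illustration of the flat-to-nested idea behind NRDM, here is a hypothetical Python sketch that groups flat order-line rows under order headers. Element and column names (Orders, Order, Item, ORDER, ITEM) are made up for illustration; in DS the equivalent grouping is done with nested Query transforms:

    ```python
    import xml.etree.ElementTree as ET
    from itertools import groupby

    # Flat rows sorted by the grouping key, as a flat-file source might deliver.
    flat = [
        {"ORDER": "100", "ITEM": "A"},
        {"ORDER": "100", "ITEM": "B"},
        {"ORDER": "200", "ITEM": "C"},
    ]

    root = ET.Element("Orders")
    for order_id, lines in groupby(flat, key=lambda r: r["ORDER"]):
        order = ET.SubElement(root, "Order", id=order_id)
        for line in lines:
            ET.SubElement(order, "Item").text = line["ITEM"]

    xml_text = ET.tostring(root, encoding="unicode")
    ```

    Note that `groupby` assumes the input is already sorted by the grouping key; the DS approach likewise nests the line-level schema under the header-level schema.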

  • 3d modeling & AE

    I'm finishing a project and want to sharpen it with FX: things like muzzle flashes, bullet hits, lasers, rounds ejecting, spaceships, swords going through bodies and the like. I've found very little information on what 3D modeling software will work with AE CS3. Can anyone give advice on 3D modeling software that can be used for integration in AE for use in PPro CS3 projects?

    There are many 3D applications that will allow you to export camera, light and position data to After Effects as real AEC data or as RLA / RPF data.
    However, CS3 does not support 3D objects.
    Here in Hollywood, all of the 3D motion graphics designers I know use CINEMA 4D with After Effects, because it sends out a real AEC file.
    You still need to render all of your 3D animations.
    Also check
    Video Co-Pilot for a whole section about using different 3D with AE.
