Open Hub Scenarios & DataSource Replication

Hi all,
  1. Please can anyone give me some scenarios for Open Hub? Why would we retract data from the data targets to other places?
  2. When we replicate a DataSource for the first time, do we get the data along with it or not? I am confused, please help me out.
I will assign points.
Thanks in advance,
Priya

Hi Priya,
   An Open Hub scenario means we need to retrieve data from the BW system into an external system. For example, we extract the data into an external system for further processing, such as merging the datasets with data from other non-SAP systems, business validations, etc. The final dataset is then loaded back into BW.
   The Open Hub service enables the extraction of data from an SAP BW system into external data marts, analytical applications, and other applications, and ensures controlled data distribution across several systems. The central object for the export of data is the InfoSpoke. With it you define the object from which the data comes and the target into which it is transferred.
   Through the Open Hub service, SAP BW becomes the hub of an enterprise data warehouse; the distribution of data stays transparent through central monitoring of the distribution status in the BW system.
   On your second question: replicating a DataSource for the first time only copies its metadata (the DataSource structure) into BW - no data comes along with it. Data is transferred only when you run a load, e.g. via an InfoPackage.
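   For a concrete picture: with a database-table target, BW generates a transparent table named /BIC/OH<destination> that the downstream application can read directly. A minimal sketch in ABAP, assuming a destination called ZSALES (a hypothetical name):

   " Read the rows the last extraction wrote into the generated Open Hub
   " table; /BIC/OHZSALES follows the /BIC/OH<destination> naming rule.
   DATA lt_rows TYPE STANDARD TABLE OF /bic/ohzsales.
   SELECT * FROM /bic/ohzsales INTO TABLE lt_rows.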
regards
@jay

Similar Messages

  • SAP BI 7.0 to SAP PI to FTP with Open Hub Destination ... Help!!!!

    Dear SCN Experts,
    I am currently working on a scenario where I have a requirement to push data from SAP BI 7.0 to SAP PI 7.0.
    I am using a client proxy on the SAP BI side to send the data from BI tables to SAP PI, which then writes it to an FTP address.
    The challenge I am facing is how to use the ABAP proxy with a process chain; I also have an Open Hub Destination created for the same (specifically with the new version, i.e. 7.0).
    Can you at least help me understand the steps involved for the client proxy and the process chain, and how the proxy is triggered?
    I have searched SDN but only found documents covering older versions, which don't serve the purpose.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

    Hi Michal,
    Thanks for the reply buddy.
    I know that we can run a scheduled report for the proxy to fetch the data. But the client requirement is to use a process chain with an Open Hub Destination that fetches data from the ODS into a Z-table.
    We then need to fetch that data from the table via our proxy. I am familiar with the report-based method.
    I have one document using the same method, but it uses an InfoSpoke (which is now obsolete) instead of an OHD, plus proxies from XI's older version, so it is not helping me out (a sketch of the kind of report such a chain step could run follows this message).
    Please do the needful, or send me a sample scenario like this with screenshots. It would be a great help.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]
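    A minimal sketch of the report such a process-chain ABAP step could run, assuming a generated outbound proxy class ZCO_MI_BI_DATA_OUT with request type ZMT_BI_DATA_OUT and a staging table ZBI_OHD_DATA (all three names are hypothetical; the real ones come from SPROXY and the Open Hub Destination):

    REPORT zbi_send_ohd_data.

    " Rows the Open Hub DTP wrote into the staging Z-table.
    DATA lt_data TYPE STANDARD TABLE OF zbi_ohd_data.
    DATA lo_proxy TYPE REF TO zco_mi_bi_data_out.
    DATA ls_output TYPE zmt_bi_data_out.

    SELECT * FROM zbi_ohd_data INTO TABLE lt_data.

    TRY.
        CREATE OBJECT lo_proxy.
        " Method and message-type names come from the proxy generated in
        " SPROXY; EXECUTE_ASYNCHRONOUS / OUTPUT is the XI 3.0-style pattern.
        ls_output-mt_bi_data_out-row = lt_data.
        lo_proxy->execute_asynchronous( output = ls_output ).
        " Asynchronous proxy messages are handed over to PI only on commit.
        COMMIT WORK.
      CATCH cx_ai_system_fault.
        " Fail the job so the process chain step turns red.
        MESSAGE 'Proxy call failed' TYPE 'E'.
    ENDTRY.

    Scheduled as an ABAP program step right after the DTP, the report picks up whatever the Open Hub Destination just loaded and hands it to PI, which performs the FTP write.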

  • Inconsistency with ZPMSENDSTATUS for Open Hub Destination

    Hi Experts,
    I don't know if you have come across this scenario, but I will appreciate all your advice.
    We have APO 7.0 and we are using the BW component to extract data to Informatica.
    Our environment is as follows:
    SAP SCM 7.0 SP8
    Informatica PWCD 8.1.6 hot fix 13
    1. We have created 5 chains with similar variants for ZPMSENDSTATUS.
    2. Each variant relates to a folder and workflow to be executed within Informatica.
    3. When we execute the chains, only two work correctly all the time and three fail.
    The three that fail do so within ZPMSENDSTATUS when running the API RSB_API_OHS_DEST_SETPARAMS. This is odd, since the other two only differ in the workflow to be executed and, obviously, in the Open Hub destination. Can anyone provide some advice on this?
    thanks,
    Nat

    Hi Maruthi,
    thanks for your response, but here are the details for the variants that are working and failing:
    working:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP2_BGA:S_M_SAP_APO_PART_PLATFORM_ABP2
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    Failing:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP4_ATR:S_M_SAP_APO_PART_PLATFORM_ABP4
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    So you see, they all target the same folder, just different workflows. Still, it works for the first and not for the second.

  • Issue in transport regarding Open Hub Services

    Hi All,
    I have a strange scenario: I am moving one of my Open Hub related transports from the system test environment to UAT, but each time it fails.
    The background: I have an InfoSpoke based on an ODS, where I have included a field called BRAND in the source structure.
    The transport went fine from Dev to the system test environment, but it fails while moving from system test to UAT.
    The error is:
    Program ZCL_IM_ZLP_WRKF===============CP, Include ZCL_IM_ZLP_WRKF===============CM001: Syntax error in line 000085
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Any comment will be appreciated.
    Regards,
    Kironmoy Banerjee.

    Hi,
    Generally, transports fail in a few situations: wrong sequence, missing dependencies, inactive objects, or RFC issues.
    So we should follow the sequence and dependencies while transporting the objects, and they should be in an active state in the source system first.
    In your case, check whether your source fields are mapped correctly and active.
    As per your error:
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Check why the source structure 'WA_SOURCE' is missing the component '0IS_CLAIM__BRANDVALE' - most likely the changed source structure was not transported to UAT or is not active there.
    Check whether it is mapped correctly and active, follow the sequence if there are any dependencies, and re-transport the request.
    Regards.
    Rambabu

  • Query regarding Open Hub Services

    Hi All,
    I am trying to find out whether delta extraction through Open Hub services is possible from a cube that is loaded with a daily full load.
    To make the scenario clearer:
    I have a cube A which is loaded on a daily basis (FULL load), but before each load the whole content of the cube is deleted and a fresh full load is run - this is how we get the updated data.
    Now I want to set up an InfoSpoke for this cube in delta extraction mode. Is this possible?
    It would be of great help if someone could explain how the delta mechanism works in Open Hub services. Any suggestion would be highly appreciated.
    Regards,
    Kironmoy Banerjee

    Can anyone tell me how to extract the master data attached to a field through an InfoSpoke?
    Example: I have a field called 0IS_BP in one of my InfoCubes, and I have created an InfoSpoke where I have mapped this field. But when it extracts data we only get the keys, e.g. 0IS_BP = 1100022, 1100023, 1100024, etc.; we cannot see the master data assigned to them, i.e. the name or address of BP no. 1100022, 1100023 or 1100024.
    Can anyone tell me how to add the names to these BPs in the InfoSpoke?
    Regards,
    Kironmoy Banerjee.

  • Open Hub External System

    Hi,
    I have a basic requirement to extract data out of Open Hub into an external Oracle database. I have read previous threads on this topic, but still have a couple of questions that are not clearly answered.
    Firstly, my client complies with all licensing arrangements to use Open Hub. They do not want to use flat files, and they do not have access to tools like the Microsoft Connector described in many online blogs.
    Secondly, my client would like BW to kick off the process chain, load into an internal BW table, and then raise a trigger to the external Oracle database to come and "pull" the data from the newly created table. I am new to working with APIs, but I read about standard APIs being available to achieve this, and I would like some simple clarification:
    - In my process chain, after I load the Open Hub data into the internal table, I want BW to use the API RSB_API_OHS_3RDPARTY_NOTIFY to tell the external system it is ready to pull the data. Reading the documentation about this API, it seems this needs to be initiated from the external Oracle database side. Is this correct (I would hope BW could do this)? And if so, I don't know whether my client has these APIs on their Oracle system. How do I ensure they do? I imagine they are freely available, but since I am not a DBA and do not have access to someone who can tell me, I would like to know what the standard procedure for API communication is here. An example or white paper showing the API (with parameters) for a real-life scenario would be ideal.
    - If the above does not work: even though they say the Microsoft Connector tool can be used for Oracle, I am hesitant to use a Microsoft tool when surely Oracle has its own free tool to do this - or can the Microsoft Connector integrate directly with an Oracle database?
    - When we refer to the term "Third Party Tool", does this mean an ETL tool like Informatica, a connection tool like the Microsoft Connector, or the external database like the Oracle database in this example?
    Any clarification on the above would be much appreciated. I do not have access to a BW system yet, so I would like to know some of the details before I propose a potential solution.
    Thanks,
    BWPerson

    Hi
    Here are some documents:
    http://help.sap.com/saphelp_nw04/helpdata/en/ce/c2463c6796e61ce10000000a114084/content.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9
    http://www.bwexpertonline.com/archive/Volume_01_(2003)/Issue_05_(May)/V1I5A5.cfm?session=
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5f12a03d-0401-0010-d9a7-a55552cbe9da
    http://help.sap.com/saphelp_nw04/helpdata/en/e3/e60138fede083de10000009b38f8cf/tree.htm
    http://www.thespot4sap.com/forums/m_30/mpage_1/key_/tm.htm
    hope it helps
    Fil.

  • Open Hub Destination Delta issue

    Hi experts,
    We have a scenario in which an Open Hub destination is created on top of a cube.
    Regular deltas move from this cube to the Open Hub daily.
    But as per a request we had to drop the cube data, and the reload took 2 days to complete.
    So those 2 days of data went missing in the Open Hub.
    Now, as the new requests are not datamarted, the next time the DTP between the cube and the Open Hub runs it will fetch all the requests - we don't want that.
    So, can we run an init without data transfer so that the source status is fetched?
    But then how do we load those 2 days of data alone?
    Kindly advise.

    If the cube doesn't have any other targets apart from this OHD:
    1. Delete the two days' requests in the cube.
    2. Set the init from the cube to the OHD.
    3. Load the cube with a delta (this brings the two days' requests back) and run the DTP to the OHD.

  • Populate text description in the open hub destination

    Hi SDNers,
    I have a requirement with an Open Hub destination: I am taking cube data to a SQL Server database.
    In this scenario I need text descriptions from the cube, but we cannot see text description data in a cube (a cube holds dimension data and at most navigational attribute data, but no textual descriptions).
    The text data lies outside the cube, but I need a couple of text descriptions for the master data texts.
    Do I need to write code to populate (look up) the text descriptions from the text table?
    Can anyone suggest an idea?
    Thanks,
    Satya

    Yes, you will want to write code in the transformation from the cube to the Open Hub that fetches the text from the text table of that InfoObject.
    Make sure to create a new InfoObject with a length equal to the text field, add this new InfoObject to the Open Hub, and in the transformation write a field routine like:
    DATA wa_text TYPE rstxtsh.  " short text field of the text table

    SELECT SINGLE txtsh FROM /bic/txxxxx INTO wa_text
      WHERE /bic/xxxxx = source_fields-/bic/xxxxx  " text table key = source field
        AND langu = 'EN'.
    IF sy-subrc = 0.
      result = wa_text.
    ENDIF.
    CLEAR wa_text.
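    Note that a SELECT in a field routine runs once per record; for larger volumes it is usually better to buffer the whole text table into an internal table in the start routine and READ it (binary search) in the field routine instead.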

  • SAP Open Hub Vs SAP JCO for Extraction

    Hi,
    I would like to clarify something about SAP JCo connectivity from a third-party tool to an SAP BW server, in comparison to Open Hub:
    We want to connect our internal third-party tool to the SAP BW/BI server using Java technology through the SAP-provided JCo. We would then create RFCs and BAPIs on the SAP BW server for data extraction and call those BAPIs and RFCs from our Java programs.
    In this kind of scenario we can extract metadata as well as data for all defined data targets (InfoCube/ODS/DSO/InfoObject), and can also build a data-load scheduling program to extract data and store it in our non-SAP application.
    As per my understanding, since we are using the SAP-provided JCo there won't be any licensing issue, whereas Open Hub requires a license if it is used to extract and send data from data targets to third-party tools.
    Can someone confirm in which of the above cases a license is actually required? Mainly I would like to know: if we use the SAP-provided JCo for connectivity and data extraction instead of Open Hub, would a license be required or not?
    Your speedy response would be highly appreciated.
    Regards,
    Vivek

    Hi,
    refer to these links:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b6d59590-0201-0010-5f9d-adb9f6b14d80
    help.sap.com/bp_biv270/documentation/SAP_BW_3.5_Functoin_Detail.pdf
    www.sapbasis.ru/library/ep_tech_and_program/ep_tech_and_program-7_8.pdf
    thanks
    naresh

  • InfoSpoke (OPEN HUB SERVICE)

    Hi Everybody,
    I would like to use an InfoSpoke to share cube information with a CRM system.
    I will try to explain the scenario I would like to achieve:
    I want the CRM system to check at a defined interval whether there is information to move into its system. Once the CRM system has picked up (collected) the information from BW, CRM has to initiate a process to delete the information from the InfoSpoke target.
    1)
    Is this possible, and how could the CRM people "program" something that does that? Is there something standard on the CRM side (i.e. something that does not require writing a program)?
    2)
    Is it possible to enhance the InfoSpoke structure on the BW side? I would like to add a SYSDATE to the structure, just to record on each record when it was written (created) by the InfoSpoke.
    I am going to appreciate any idea and comments.
    Regards,
    FedeX

    Hi Fedex,
    1)
    Is this possible, and how could the CRM people "program" something that does that? Is there something standard on the CRM side?
    You can generate files on an application server, and each run will overwrite the file if the file name is the same. Why do you want to delete the files?
    2)
    Is it possible to enhance the InfoSpoke structure on the BW side to add a SYSDATE to each record?
    You can add the new field via the BAdI: enhance the target structure (add the new field) and populate it with ABAP code in the BAdI implementation, as in the sketch below.
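    A minimal sketch of such a BAdI implementation, assuming the InfoSpoke transformation BAdI OPENHUB_TRANSFORM with its TRANSFORM method; the structure names are hypothetical and the table parameter names (i_t_data_in / e_t_data_out) are from memory - check the interface IF_EX_OPENHUB_TRANSFORM in your system:

    METHOD if_ex_openhub_transform~transform.
      " Source and target line types are the structures generated for
      " this InfoSpoke; both names below are placeholders.
      DATA ls_out TYPE zoh_crm_target.
      FIELD-SYMBOLS <ls_in> TYPE zoh_crm_source.

      REFRESH e_t_data_out.
      LOOP AT i_t_data_in ASSIGNING <ls_in>.
        " Copy all identically named fields, then stamp the load date.
        MOVE-CORRESPONDING <ls_in> TO ls_out.
        ls_out-sysdate = sy-datum.
        APPEND ls_out TO e_t_data_out.
      ENDLOOP.
    ENDMETHOD.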
    More info:
    BW Open Hub (ppt)
    https://websmp103.sap-ag.de/~sapidb/011000358700002201112003
    Hope it Helps
    Srini

  • Open Hub, InfoSet

    Hi all,
    Can anyone tell me what Open Hub services and InfoSets are? Are they used in projects, and in which type of scenario are they used?
    Thanks in Advance.
    jeevan.

    Hi Jeevan,
    Please search the forum before posting a new question - you will find lots of good information on both topics.

  • Open Hub licence

    Gurus
    I am trying to use the Open Hub service in BI 7.0, and in this case I need to distribute data to other non-SAP systems.
    Do I need a license for this?
    thanks

    If you extract data from your SAP NetWeaver BI system to a non-SAP system, you have to license this scenario - no matter which tool you use for the extraction, e.g. Open Hub Destination, LISTCUBE, any BI API, the Analysis Process Designer, or an Excel download.
    Note that a licensed BI user may download to Excel for his/her own analysis purposes without any further license costs. Distributing Excel workbooks to non-SAP-BI users for personal reporting purposes requires an Information Broadcasting license, not an Open Hub license.
    Hope it Helps
    Chetan
    @CP..

  • License requirement for Open Hub?

    Hello,
    We're using Open Hub extensively in our project, which is in the QA phase, and it has been working fine. We're on NW 2004s BI 7.0.
    The question is: do we need a license to run this? I came across an old SAP presentation that mentioned this would require a license.
    Any SAP people, or those running Open Hub in production, please let me know.
    thanks
    Mayil

    Hi Mayil,
    Check this PPT :
    https://websmp204.sap-ag.de/~sapidb/011000358700002201112003#398,1,Open Hub
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9
    http://help.sap.com/saphelp_nw04s/helpdata/en/43/58e1cdbed430d9e10000000a11466f/frameset.htm
    follow this link for scenarios...
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/5f12a03d-0401-0010-d9a7-a55552cbe9da
    cheers
    Sunil

  • Creating multiple flat files from a single source using Open Hub

    Hi All,
    I am not sure if it is a common scenario, but we have a requirement to split the data in the target into 3 different flat files depending on certain conditions. For example:
    If the value of an attribute in a record is A, then I have to create one file; if it is B, then another file; and if it is C, then a third file.
    Is this possible to achieve through Open Hub? We also have a requirement to load only delta records. Any input is appreciated and will be rewarded.
    Thanks and Regards,
    Sundar

    Yes, it is possible.
    All you need to do is create 3 destinations, each of type flat file:
    1. Link your data target to Open Hub Destination 1 by creating a transformation, and when creating the DTP use filter A.
    2. Link your data target to Open Hub Destination 2 by creating a transformation, and when creating the DTP use filter B.
    3. Similarly for Open Hub Destination 3, use filter C.
    If the filter value has to be derived at runtime, see the filter-routine sketch below.
    Regards, Vj
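    A minimal sketch of a DTP filter routine for that case, following the frame BW generates for selection routines (the field name ATTR and the value logic are placeholders; l_t_range and p_subrc come from the generated frame):

    DATA l_idx LIKE sy-tabix.

    " Locate the existing range line for the filter field, if any.
    READ TABLE l_t_range WITH KEY fieldname = 'ATTR'.
    l_idx = sy-tabix.

    " Restrict this DTP to records whose attribute value is 'A'.
    l_t_range-sign   = 'I'.
    l_t_range-option = 'EQ'.
    l_t_range-low    = 'A'.

    IF l_idx <> 0.
      MODIFY l_t_range INDEX l_idx.
    ELSE.
      APPEND l_t_range.
    ENDIF.
    p_subrc = 0.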

  • Open Hub fetching data from Change log table

    Hi Gurus,
    I have a scenario in which I have an Open Hub destination whose source is a DSO, and we are sending the data into SAP Data Services. We found that the data being sent comes from the change log table. Is this how an Open Hub with a DSO as source works, or is there a way to send data from the active table of the DSO when the DTP runs?
    Kindly help. Thanks in advance !
    Regards
    Snehith

    Hi,
    For extraction from a DataStore object, the DTP offers the following options on its Extraction tab:
    Active Table (with Archive)
    The data is read from the active table and from the archive, or from near-line storage if one exists. You can choose this option even if there is no active data archiving process yet for the DataStore object.
    Active Table (Without Archive)
    The data is only read from the active table. If there is data in the archive or in near-line storage at the time of extraction, this data is not extracted.
    Change Log
    The data is read from the change log of the DataStore object.
    You can select any of these options in the DTP, so switching from Change Log to one of the Active Table options will make your DTP read the active data instead.
    Regards,
    Geetanjali
