Open Hub - SAP's future direction

Is anyone aware of the direction that SAP has planned for the future of Open Hub? Would appreciate your feedback.

Similar Messages

  • Open Hub (SAP BW) to SAP HANA through DB connection: data loading with the "Delete data from table" option is not working. Please help, anyone from this forum

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table using the DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Deleting Data from Table" option.
    16 records were loaded from BW to HANA, matching the source.
    The second time I executed the load from BW to HANA, 32 records arrived (the load appends).
    I then executed the Open Hub service with the "Deleting Data from Table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    If I test SAP BW to SAP BW, it works fine.
    Is this option supported through a DB connection or not?
    Please see the attachment in this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub definition level (Destination tab and field definition):
    there is a check box there which I have already selected; that is exactly my issue, because even with it selected,
    the deletion is not performed at the target.
    SAP BW to SAP HANA via DB connection:
    1. First run from BW: 16 records, DTP executed, loaded to HANA, 16 records there as well.
    2. Second run from BW: the HANA side appends, so 16 + 16 = 32 records.
    3. So I selected the "Deleting Data from Table" check box at the Open Hub level.
    4. Executing the DTP now throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the "Deleting Data from Table" option applicable for HANA at all?
    Thanks
    Santhosh Kumar
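    Until the short dump is resolved, a common workaround is to clear the target table yourself right before the DTP run, e.g. from a small scheduled script, instead of relying on the "Deleting Data from Table" flag. Below is a minimal sketch of that truncate-before-load pattern. It uses Python's sqlite3 as a stand-in for the HANA table so it can run anywhere; only the table name L_F50800_D comes from the post, and in a real landscape you would use SAP's hdbcli driver with your HANA connection details instead.

```python
import sqlite3

def truncate_before_load(conn, table, rows):
    """Clear the target table, then insert the new batch,
    so repeated runs do not append duplicates."""
    cur = conn.cursor()
    cur.execute(f'DELETE FROM "{table}"')                       # truncate step
    cur.executemany(f'INSERT INTO "{table}" VALUES (?, ?)', rows)
    conn.commit()
    cur.execute(f'SELECT COUNT(*) FROM "{table}"')
    return cur.fetchone()[0]

# stand-in for the HANA schema-level table created by the Open Hub destination
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "L_F50800_D" (doc TEXT, amount REAL)')

batch = [(f"DOC{i}", 10.0 * i) for i in range(16)]
print(truncate_before_load(conn, "L_F50800_D", batch))  # first run: 16
print(truncate_before_load(conn, "L_F50800_D", batch))  # second run: still 16, not 32
```

    Because the delete happens in the same transaction as the insert, a rerun replaces the previous batch rather than appending to it, which is the behavior the check box was expected to provide.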

  • APD / Open Hub Reports Execution Steps

    Hi Friends,
    Can anybody let me know how I can execute APD and Open Hub reports and direct them to logical file names? And where do we configure the mapping of logical file names to physical directory locations?
    Thanks,
    Suba.

    For APD
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f06353dd-1fe3-2c10-7197-dd1a2ed3893e?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a019563e-bae8-2c10-0abf-b760907630e9?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0b7f862-350f-2d10-c6a6-b8bff98d71e7?QuickLink=index&overridelayout=true
    For Open HUB
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d0b7f862-350f-2d10-c6a6-b8bff98d71e7?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?QuickLink=index&overridelayout=true
    Regards,
    Sushant

  • Future direction of User Provisioning Tools ( GRC CUP or IDM)

    Hi Security Colleagues,
    We all know that SAP has GRC CUP (Access Enforcer) and NW IDM for provisioning.
    We can use either tool for user provisioning.
    Based on your experience, what is the best tool? Of course, it changes from one company to another depending on requirements.
    I have noticed that a lot of SAP development activity is going on around IDM.
    Based on SAP's future direction, what is the best tool?
    It is a common question for most SAP customers, as SAP is including IDM as part of the NetWeaver license.
    Please share your thoughts.
    Thank You.

    For future product availability, I always prefer the following two places to check. Can you please also check there?
    http://service.sap.com/pam
    http://service.sap.com/scl
    Check the following Two points under the 2nd Link:
    Scenario & Process Component
    SAP's Release Strategy
    Now, based on your query, I will also stick to the suggestions given in the other two posts. To add a few more points which you may find helpful, I would like to emphasize the following:
    • SAP NetWeaver Identity Management helps companies to centrally manage their user accounts (identities) in a complex system landscape. This includes both SAP and non-SAP systems.
    • The solution provides an authoritative, single source of user information and enables self-service management of user information and authorizations using workflow technology.
    • In many cases resources such as meeting rooms, PCs and mobile devices, which all may have their own identity in some context, can be included in an identity management solution.
    Out of all these points, let's discuss provisioning:
    • The term provisioning is often used to denote user provisioning or account provisioning.
    • The functionality includes:
    o creation of accounts
    o setting initial passwords
    o setting and modifying access rights
    o disabling (revoking) an account
    o deleting an account
    • The overall purpose is to make sure an identity (for example a user) has the correct access to the applications.
    • User provisioning products also include workflow capabilities to apply business rules to the account provisioning process and typically provide user self-service capabilities (e.g., password reset).
    (I picked up all these details from different sections of a solution material I prepared for my company to introduce IDM to a customer; I couldn't format it properly here due to space constraints.) You can see the importance SAP places on this product for all aspects of automating security and identity, for human and non-human resources alike. By using it you gain benefits beyond the provisioning available as separate solutions under other products such as Virsa. Please go through the relevant materials in the IDM forum (Bernhard provided you the link) before going for a realization assessment.
    regards,
    Dipanjan
    Edited by: Dipanjan Sanpui on Oct 5, 2009 11:42 AM

  • Sending data directly from BW To Oracle DW via Open hub Third party tools

    Hi All,
    We need to send data from SAP BW 7.0 to an Oracle 11g DW. The requirement is to pass data through Open Hub. So we wanted to explore the option of passing the data directly from BW to Oracle using third-party tools. I tried to gather information on third-party tools and came across a list of APIs and some pointers on creating an RFC destination through SM59, but I don't know how to do it or what parameters to pass. Could you please point me in the right direction, or list step-by-step instructions for how I can achieve this?

    Hi Amit,
    I can see the following options working for you:
    1) BODS
    2) PI (XI)
    3) BI 7.0 to FTP, and from FTP you push the files to your Oracle system.
    Regards,
    Rajesh
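    Option 3 can be scripted entirely outside BW: once the Open Hub run has written its flat file, a small job pushes it to the FTP server from which the Oracle side picks it up. A hedged sketch using Python's standard ftplib module; the host, credentials, and file paths below are placeholders for illustration, not values from this thread.

```python
import ftplib
from datetime import date
from pathlib import Path

def remote_name(local_file: str, load_date: date) -> str:
    """Build a date-stamped target name so daily loads don't overwrite each other."""
    stem = Path(local_file).stem
    return f"{stem}_{load_date:%Y%m%d}.csv"

def push_to_ftp(local_file: str, host: str, user: str, password: str) -> None:
    """Upload one Open Hub output file to the staging FTP server."""
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_file, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name(local_file, date.today())}", fh)

# example call (hypothetical paths/credentials):
# push_to_ftp("/sapmnt/openhub/ZOH_SALES.csv", "ftp.example.com", "bwload", "secret")
```

    On the Oracle side the staged file can then be loaded with SQL*Loader or an external table, which is what makes this the simplest of the three options when no middleware (BODS/PI) is available.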

  • SAP BI 7.0 to SAP PI to FTP and with Open Hub Destination ...Help!!!!

    Dear SCN Experts,
    I am currently working on a scenario where I have a requirement to push data from SAP BI 7.0 to SAP PI 7.0.
    I am using a client proxy for SAP BI to send the data from BI tables to SAP PI and then write it to an FTP address.
    The challenge I am facing is how to use the ABAP proxy with a process chain; I also have an Open Hub destination created for the same (specifically with the new version, i.e. 7.0).
    Can you at least help me understand the steps involved for the client proxy and the process chain, how the proxy is triggered, and related points?
    I have searched SDN but only found documents covering older versions, which don't serve the purpose.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

    Hi Michal,
    Thanks for the reply buddy.
    I know that we can run a scheduled report for the proxy to fetch the data. But the client requirement is to use a process chain and an Open Hub destination, which fetches data from the ODS into a Z-table created for this.
    We need to fetch that data from the table via our proxy. I am familiar with the report-based method.
    I have one document using the same method, but it uses an InfoSpoke (which is now obsolete) rather than an OHD, and also an older XI version of the proxy, so it is not helping me.
    Please do the needful, or send me a sample scenario like this with screenshots. It would be a great help.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

  • Problem connecting to SAP Open Hub

    Hi, I am trying to set up an SSIS job connecting to SAP Open Hub and, with support from the SAP guys, have made some progress, but it has now stopped on an error message we're not able to solve. Any suggestions on what could be wrong and how to solve it? When I run the package I get the following error message:
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" starting.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
    Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
    Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
    Information: 0x3E8 at Data Flow Task, SAP BW Source: Process Start Process, variant has status Completed (instance DH88PUV2SZBIFKMIF48K3USME)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: Process Data Transfer Process, variant /CPMB/HMIJYDZ -> ZOH_VPL has status Ended with errors (instance DTPR_DH88PUV2SZCA46Y9QNO66A6W6)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: The component is stopping because the Request ID is "0".
    Error: 0x3E8 at Data Flow Task, SAP BW Source: No data was received.
    Error: 0xC0047062 at Data Flow Task, SAP BW Source [41]: System.Exception: No data was received.
       at Microsoft.SqlServer.Dts.SapBw.Components.SapBwSourceOHS.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
       at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
    Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on SAP BW Source returned error code 0x80131500.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
    Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
    Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
    Task failed: Data Flow Task
    Warning: 0x80019002 at Package3: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches
    the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" finished: Failure.
    The program '[6916] DtsDebugHost.exe: DTS' has exited with code 0 (0x0)
    Regards
    Paal

    Hi Paleri,
    According to the thread which has the same error message, the issue may be caused by incorrect RFC settings. Could you double-check your RFC connection configuration, such as the DNS settings?
    If it is not the case, please also make sure you have installed the correct version of Microsoft Connector for SAP BW.
    Reference:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/e2fbafe5-d9df-490a-bfad-3d4b9784a8ea/sap-bi-connector-for-ssis-2008?forum=sqlintegrationservices
    Regards,
    Mike Yin

  • Open Hub Destination - SAP CRM

    Hi,
    We would like to transfer bulk data from SAP BI (Data Store Object) into SAP CRM. Performance of APD has been very poor. As an alternative, we are checking the feasibility of using InfoSpoke/Open Hub service to transfer data from data store object to ADS (analytical data store i.e. custom table in CRM).
    Has any one tried this before?
    Can we use remote database (i.e. not SAP BI) in open hub destination?
    Any help will be highly appreciated.
    Thanks.

    Hi,
    Yes, I have used the InfoSpoke/Open Hub service to transfer data from SAP BI to another SAP system.
    We have a process chain with a start node and 3 parallel nodes extracting data from 3 different ODSs.
    All 3 nodes are connected to an AND node; after that there is a program (a Unix script) which does the FTP. An ABAP program can also be used for this.
    While doing the FTP, give the destination address of the CRM system.
    I am sure this will work.
    Milan Kothari

  • SAP Open Hub Vs SAP JCO for Extraction

    Hi,
    I would like clarification on SAP JCo connectivity from a third-party tool to an SAP BW server, compared to Open Hub:
    We want to connect our internal third-party tool to the SAP BW/BI server using Java technology through SAP's JCo. Later we will create RFCs and BAPIs on the SAP BW server for data extraction, and call those BAPIs and RFCs from our Java programs.
    In such a scenario we can extract metadata as well as data for all defined data targets (InfoCube/ODS/DSO/InfoObject), and can also write a data-load scheduling program to extract data and store it in our non-SAP application.
    As per my understanding, since we are using the SAP-provided JCo, there won't be any licensing issue; Open Hub, on the other hand, requires a license if it is used to extract and send data from data targets to third-party tools.
    Can someone confirm in which of the above cases a license would actually be required? Mainly I would like to know: if we don't use the SAP-provided JCo for connectivity and data extraction, would a license be required or not?
    Your speedy response would be highly appreciated.
    Regards,
    Vivek

    Hi,
    refer to these links:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b6d59590-0201-0010-5f9d-adb9f6b14d80
    help.sap.com/bp_biv270/documentation/SAP_BW_3.5_Functoin_Detail.pdf
    www.sapbasis.ru/library/ep_tech_and_program/ep_tech_and_program-7_8.pdf
    thanks
    naresh

  • Open Hub Destinations - Write directly to a table on 3rd party DB.

    Hi All,
    Is there a way to create a table in a non-SAP system (like a 3rd-party DB) using Open Hub destinations?
    If yes, what are the steps?
    Thanks,
    Hima

    There is a 3rd party extractor to load data from BI into SQL Server (and vice versa) using Open Hub:
    http://msdn.microsoft.com/en-us/library/dd299430.aspx
    Summary: This white paper demonstrates the use of the Microsoft Connector 1.0 for SAP BI in Microsoft SQL Server 2008 Integration Services packages. It shows how to load data into SAP BI by using the SAP BI destination, how to extract data from SAP BI by using the SAP BI source, and how to prepare extracted data for analysis in SQL Server Analysis Services.

  • Use of Open Hub Destination to load data into BPC

    Hi Gurus,
    I want to load data from SAP BI to BPC (NW version) using an Open Hub destination. I would like to know a few things about this.
    1) What method should I use? Should I use a flat file or a database table as the destination? If a database table, how can I use the data in that table in BPC?
    2) If I go for a flat file, which is saved on the application server of BI, how can I import that file into BPC and use it via the Data Manager?
    3) Also, in the case of flat files, two files are created: a control file and a data file. How do I use both of them? Or can I just use the data file without a header?
    Your replies will be much appreciated.
    Thanks,
    Abhishek

    Hi Anjali,
    I can use the standard Data Manager package from BI to BPC if the CSV file is available on the BPC server, or if I am directly extracting from a BW object (InfoObject or InfoCube).
    But since I will be using Open Hub, its output can be a database table or a CSV file, preferably on the SAP application server. In such cases, how can I use the database table or CSV file in the standard Data Manager package?
    Thanks for your reply.
    Abhishek
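    On question 3: the control file is essentially a field list describing the columns of the data file, which itself has no header. A hedged sketch of how the two could be stitched together before handing the result to a downstream consumer; the two-column control-file layout (field name, description) is an assumption for illustration, not the documented Open Hub control-file format.

```python
import csv
import io

def read_open_hub_output(control_csv: str, data_csv: str) -> list:
    """Combine a headerless data file with the field names from the
    control file into a list of row dicts."""
    # assumed layout: the first column of each control-file row is the field name
    fields = [row[0] for row in csv.reader(io.StringIO(control_csv)) if row]
    return [dict(zip(fields, row)) for row in csv.reader(io.StringIO(data_csv))]

# hypothetical control and data file contents
control = "ACCOUNT,Account\nTIME,Fiscal period\nAMOUNT,Amount"
data = "61000,2011.JAN,1500.00\n61000,2011.FEB,1720.50"

rows = read_open_hub_output(control, data)
print(rows[0])  # {'ACCOUNT': '61000', 'TIME': '2011.JAN', 'AMOUNT': '1500.00'}
```

    In practice you would read both files from the application server directory, then either write the merged result with a header row (so the Data Manager flat-file import can map columns) or insert it into a staging table.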

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an Open Hub destination that saves the result as a file on the application server.
    The issue: in the production environment we have two servers. XYZ is the database server and A01 is the application server. When I direct the Open Hub to save the file on A01, everything works fine. But when I change it to save on XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I have already checked: authorization, disk space, SM21 logs.
    We are on SAP BW 7.31, support package 6.
    Any idea what the issue could be or where to look?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start a process chain from, the DTP job always runs on the A01 server, and this causes an error since the directory doesn't exist on server XYZ.
    This occurs because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open the DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run", pick the desired server
    - save
    After that, no matter from where I start the Open Hub extraction, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Cube to Open Hub DB destination - Aggregation of records

    Hi Folks,
    I am puzzled by the BW 7.0 Open Hub DB destination with regard to aggregation.
    With the BW 3.5 Open Hub DB destination, I got records from the cube already aggregated depending on which fields I selected. E.g. the cube has cal week / cal month, but with only cal month selected in the InfoSpoke, I get only one record per month (not several, one for each week of the month).
    With the BW 7.0 Open Hub destination it seems to be different. Although cal week is not used in any transformation rule and is not part of the destination definition, I still get all weeks of the month as single records. In theory this would not be a problem if the records were aggregated according to the semantic key of the Open Hub destination DB; instead, an error is issued: a duplicate-record short dump.
    So do I understand correctly that with BW 7.0, record aggregation, e.g. cal week/month -> cal month, is not possible at all? Or am I doing something wrong?
    Will I need an intermediate DSO in between, or is there another way to get the aggregation working "directly" for the Open Hub?
    This is quite a shortcoming of the Open Hub, not to mention the non-availability of navigational attributes and the restriction to sourcing from cubes rather than MultiProviders. It seems the Open Hub in BW 7.0 got worse compared to BW 3.5.
    Thanks for all replies in advance,
    Axel

    Hi Axel,
    We can use 0CALMONTH in the Open Hub destination. In BI 7.0 we cannot extract data from a MultiProvider using Open Hub,
    but in BW 7.30 we have this functionality (using a DTP we can extract data from a MultiProvider to an OHD).
    There is no need to use an intermediate DSO; we can extract data directly from the InfoCube.
    Please check the below documents.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?quicklink=index&overridelayout=true
    Regards,
    Venkatesh
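    What the intermediate DSO effectively does in Axel's scenario is collapse the weekly records onto the semantic key of the destination. The same aggregation, sketched in Python with made-up weekly records (field names and values are illustrative, not from the thread):

```python
from collections import defaultdict

def aggregate_to_month(records):
    """Sum the key figure per calendar month, dropping the week level --
    the aggregation the BW 3.5 InfoSpoke performed implicitly."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["calmonth"]] += rec["amount"]   # calweek is simply ignored
    return [{"calmonth": m, "amount": a} for m, a in sorted(totals.items())]

weekly = [
    {"calweek": "201401", "calmonth": "201401", "amount": 100.0},
    {"calweek": "201402", "calmonth": "201401", "amount": 250.0},
    {"calweek": "201405", "calmonth": "201402", "amount": 80.0},
]
print(aggregate_to_month(weekly))
# one record per month: 201401 totals 350.0, 201402 totals 80.0
```

    Because the output key contains only calmonth, no two result rows share a key, which is exactly what avoids the duplicate-record short dump at the DB destination.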

  • Filtering data in Open Hub

    In the InfoSpoke definition, is there a way to make filter values dynamic on the 'Selection' tab? For example, we have an InfoSpoke from which we extract data monthly, by fiscal period. Since the production instance is a closed system, we cannot change the filter value directly in production; we would have to change it in development and transport it up to production. I have read in some SAP documentation that there is a plan to have variables available as selection criteria, but it was not mentioned when this would be available. Thank you for your help.

    Hi Celine,
    Why do you want to change the selections? Why can't you use a delta?
    If you want to make the selection follow your fiscal year, create a BAdI, derive the variables from the system date, and restrict according to your selection.
    For more info check www.service.sap.com/bi -> infoindex -> O (Open hub destination).
    I hope this helps.
    Srini
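    The core of the BAdI Srini suggests is just deriving the current fiscal period from the system date, so no transport is needed when the month rolls over. That date logic, sketched in Python for illustration (assuming a fiscal variant where fiscal period equals calendar month, as in K4; in a real system this would live in the ABAP BAdI implementation):

```python
from datetime import date

def current_fiscal_period(today: date) -> str:
    """Return the fiscal period in the BW format YYYYPPP,
    assuming fiscal period = calendar month (variant K4)."""
    return f"{today.year}{today.month:03d}"

def previous_fiscal_period(today: date) -> str:
    """Period of the month before 'today', handling the year boundary."""
    year, month = (today.year - 1, 12) if today.month == 1 else (today.year, today.month - 1)
    return f"{year}{month:03d}"

print(current_fiscal_period(date(2009, 10, 5)))   # 2009010
print(previous_fiscal_period(date(2009, 1, 15)))  # 2008012
```

    A monthly extraction would typically filter on the previous period, so the January run picks up period 012 of the prior year automatically.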

  • Open hub services

    hello Friends,
    Can you please tell me what the use of the open hub service is?
    As I understand it, it sends data to a flat file. But we can also send data directly through an export DataSource.
    I just want to know whether there are any additional advantages to open hub.
    Thanks in advance

    Hi,
    The open hub service enables you to distribute data from an SAP BW system into external data marts, analytical applications, and other non-SAP applications. With this, you can ensure controlled distribution across several systems. The central object for the export of data is the InfoSpoke (an SAP BW term). Using it, you define the object from which the data comes and the target into which it is transferred.
    Through the open hub service, SAP BW becomes the hub of an enterprise data warehouse. The distribution of data becomes transparent through central monitoring of the distribution status in the BW system.
    regards
    Vc
