Open Hub DB table

Hi Gurus,
Please let me know in which database table I can find the description of an open hub.
For example, if I define an open hub destination with ID ZTESTOHUB and description "Test Open Hub", in which table and field is the description "Test Open Hub" stored?
Regards
MG

Hi,
You can use table RSBSPOKET, which contains the list of all InfoSpokes with their short and long descriptions (only one of these can be maintained).
The tables related to InfoSpokes are listed below; they might be helpful for future use:
RSBSPOKESELSET - InfoSpoke directory and selection options
RSBSPOKEVSELSET - InfoSpoke directory with selection options and versioning
RSBSPOKE - List of all InfoSpokes with the attributes maintained in transaction RSBO, including the names of the source and target structures
RSBSPOKET - List of all InfoSpokes with the short and long descriptions (only one of these can be maintained)
RSBSTEPIDMESS - Contains all the messages recorded during the execution of an InfoSpoke. Entries can be added to this table using the ABAP class method i_r_log->add_sy_message.
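In case you want to read the description from your own code, here is a minimal lookup sketch. The field names (OHSOURCE, OBJVERS, LANGU, TXTSH, TXTLG) are assumptions based on the usual BW text-table layout, so verify them in SE11 first; note also that 7.x open hub destinations (as opposed to 3.x InfoSpokes) should keep their texts in table RSBOHDESTT.

" Hedged sketch: read the description of InfoSpoke ZTESTOHUB.
" Field names are assumed from the usual BW text-table layout.
DATA ls_text TYPE rsbspoket.

SELECT SINGLE * FROM rsbspoket INTO ls_text
  WHERE ohsource = 'ZTESTOHUB'
    AND objvers  = 'A'
    AND langu    = sy-langu.
IF sy-subrc = 0.
  WRITE: / ls_text-txtsh, ls_text-txtlg.
ENDIF.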
Hope this helps.
thanks,
rahul

Similar Messages

  • Program or Function module to delete data from Open Hub Destination Table

    Hi All,
    Can anybody suggest a program or function module to delete data from an open hub destination table?
    Thanks & Regards,
    Vinay Kumar

    You can simply go to transaction SE14, enter the open hub destination table name, select the "Delete Data" radio button, and click "Activate and adjust database".
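    If you need a program rather than a manual SE14 step, a minimal sketch of a dynamic delete follows. The default table name is a placeholder for your generated /BIC/OH* table; add an authority check and a confirmation step before using anything like this productively.

    REPORT zoh_delete_data.

    " Hedged sketch: delete all rows from a generated open hub table.
    " The default table name is a placeholder.
    PARAMETERS p_tab TYPE tabname DEFAULT '/BIC/OHZTESTOHUB'.

    START-OF-SELECTION.
      DELETE FROM (p_tab).
      IF sy-subrc = 0.
        COMMIT WORK.
        WRITE: / 'Rows deleted:', sy-dbcnt.
      ELSE.
        WRITE: / 'No rows deleted.'.
      ENDIF.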
    Regards,
    Arminder

  • Open Hub Destination table structure update

    Hi,
    I created an open hub destination; activation created the table, the data was populated, and I have a view over the table.
    I then needed to add a few more fields to the open hub. When I did that and activated it again, it was supposed to change the table structure, adding the new fields.
    The activation was successful, but the table structure stays the same; I don't see the new fields in the generated table.
    Is anyone having similar issues?
      Thanks in advance

    We deleted all the related objects from staging and production and retransported them from development as though they were being sent for the first time, and that worked. Though this approach is not good, we had to use it, as I didn't find any other solution.

  • Open Hub Destination Table Vs File

    Hello Experts,
    We have a requirement to send data from BW to 3rd party system every day. In the Open hub destination we have two options for the target.
    1. Table
    2. File
    An ETL tool like Informatica will extract the data from the table or file into the 3rd party system.
    Which option is best in terms of performance and maintenance?
    Thanks in advance
    Sree

    Hi,
    If you follow the delta mechanism, then the table option is good. You can also go for a file; that is not a problem. See examples of both in the following articles.
    If you use files, go for the application server option, i.e. AL11, because no one can change the files in this location and you can restrict the authorizations.
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy

  • Help required for creation of a process chain for Open Hub destination (table)

    Hi Experts,
    I am working on creating a process chain for an open hub destination that targets data transfer into a table, which is then pushed to SAP PI.
    2.) What are the steps to create a process chain for an OHD?
    3.) What are the steps for pushing it to PI? We have generated a client proxy, but I need to know the ABAP required to trigger the process chain.
    Please share your expertise and knowledge; at least with point 2 I need the BI experts' help.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

    Hi Gaurav,
    Creating Process chain for OHD.
    1. Go to RSPC and create a start variant.
    2. Click on "Process Types" in the menu (fourth from the left). Under this, go to the LOAD PROCESSES AND POST PROCESSING option.
    3. The fifth option is "DATA EXPORT INTO EXTERNAL SYSTEMS". Click on that.
    4. Press F4 and choose your InfoSpoke name.
    5. Press the right button (if it asks for a variant, create a new variant in the same small window beside the input bar).
    6. Join the START process and the 2nd process by dragging.
    7. Now add your required other processes.
    The general process chain processes are:
    1. Start.
    2. Delete Indexes (if data loaded to cube).
    3. Load process.
    4. Delete duplicate request (if it is a full load in cube).
    5. Create Indexes.
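    Regarding the ABAP needed to trigger the chain (point 3 of the question), a hedged sketch using the standard RSPC API follows; the chain name ZOH_CHAIN is a placeholder, and you should check the exact interface of the function module in SE37.

    " Hedged sketch: start a process chain from ABAP, e.g. from the
    " code that talks to PI. 'ZOH_CHAIN' is a placeholder.
    DATA lv_logid TYPE rspc_logid.

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZOH_CHAIN'
      IMPORTING
        e_logid = lv_logid
      EXCEPTIONS
        OTHERS  = 1.
    IF sy-subrc = 0.
      WRITE: / 'Chain started, log ID:', lv_logid.
    ELSE.
      WRITE: / 'Chain could not be started.'.
    ENDIF.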
    Hope it helps.
    Preet

  • Getting data from Open Hub Destination Table

    Hi,
    I've created an InfoSpoke and used a table as the target. Now I want to get to the data in this table from the R/3 side. Any idea where this table might be stored and how I can get to it?
    Thanks in advance!
    Regards
    Wojtek

    hi,
    Is the table on the R/3 side?
    Check the application component name given in the InfoSpoke definition.
    If the table name is known, check it in SE11/SE12/SE16 in R/3.
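    If the table lives in BW and you only have an RFC destination from R/3 to BW, a hedged sketch of a remote read follows. The destination name BWCLNT100 and the table name are placeholders, and keep in mind that RFC_READ_TABLE has well-known limits (512-byte rows, problems with some data types).

    " Hedged sketch: read a generated open hub table in the BW system
    " from the R/3 side. Destination and table name are placeholders.
    DATA: lt_data   TYPE STANDARD TABLE OF tab512,
          lt_fields TYPE STANDARD TABLE OF rfc_db_fld,
          lt_opts   TYPE STANDARD TABLE OF rfc_db_opt.

    CALL FUNCTION 'RFC_READ_TABLE' DESTINATION 'BWCLNT100'
      EXPORTING
        query_table = '/BIC/OHZTESTOHUB'
        rowcount    = 100
      TABLES
        options     = lt_opts
        fields      = lt_fields
        data        = lt_data
      EXCEPTIONS
        OTHERS      = 1.
    IF sy-subrc = 0.
      WRITE: / 'Rows read:', lines( lt_data ).
    ENDIF.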
    Ramesh

  • How to delete data from Open Hub DB table when the DTP is inactive

    Hi Gurus,
    I am transporting some 2LIS extractors to QA with some enhancements.
    The data goes to an OHD (third party), pulled by a process outside SAP enabled by a BAPI.
    Now, by mistake, I imported the OHDs with some old data still left in the OHD's DB table (/BIC/OH<>).
    It doesn't allow me to do a manual activation, and the DTP is inactive (I cannot run a delta DTP to bring the number of records to zero).
    I want to delete the data in the OHD DB tables. Any clues?
    Thanks,
    Chetan

    Hi Chetan,
    The OHD table name is usually /BIC/OH<OHD-name>; you can find it in the OHD definition.
    Go to transaction SE14, enter the table name, and choose Edit.
    Select "Delete Data" and choose "Activate and adjust database".
    This will delete the entire contents of the OHD DB table.
    Regards,
    Uma Maheswari

  • Open Hub with Attributes and Texts

    Hi,
    I'd like to know if it's possible to have one open hub (database table) with both texts and attributes.
    There's no problem when I create it or its transformations and DTPs, but when I try to load texts after the attribute load, it always generates a dump: SAPSQL_ARRAY_INSERT_DUPREC.
    I'd like to know a way to do this without using a DSO. Any suggestions?
    Kind Regards,
    Tomas

    Hi,
    Please check SAP Note 668466 in the SAP Notes system for help with this error.
    According to SAP Note 668466:
    Summary
    Symptom
    You load master data attributes or texts for a characteristic. The loading terminates with runtime errors of the type SAPSQL_ARRAY_INSERT_DUPREC or with error message RSDMD 199. The termination point is in the DBFIRE method of the CL_RSDMD_UPDATE_MASTER_DATA class.
    Other terms
    Loading master data, attributes, texts, SAPSQL_ARRAY_INSERT_DUPREC, CL_RSDMD_UPDATE_MASTER_DATA, DBFIRE
    Reason and Prerequisites
    When you load attributes or texts, data records are usually inserted in the master data tables of the characteristic using an array insert statement (P, Q, X, Y, T tables). This results in a SAPSQL_ARRAY_INSERT_DUPREC runtime error if data records that are to be inserted and that relate to the unique primary key of the relevant database table already exist in this table. The following possible reasons are currently known for the occurrence of the SAPSQL_ARRAY_INSERT_DUPREC runtime error:
    1. Data inconsistencies in the master data tables of the characteristic
    SAPSQL_ARRAY_INSERT_DUPREC runtime errors or RSDMD 199 may occur if the data appears in an inconsistent status in the master data tables of the characteristic before you load attributes or texts. Notes 323140, 566044, 592757 and 599269 describe this (among other things) in more detail.
    2. Parallel loading of nondisjunct data packages
    If master data attributes are loaded over several dialog processes that are running in parallel, you must guarantee that the data packages of the relevant request are strictly disjunct concerning the characteristic value. This means that all data records that belong to a certain characteristic value may only be contained in a single data package. If data records of a certain characteristic value are distributed over several data packages, this may cause conflicts during the parallel loading of these packages when the master data tables are accessed and this may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors.
    The same conditions that apply to loading attributes also apply to loading language-independent master data texts. If language-dependent texts are loaded in parallel, the data packages must be disjunct in relation to the combination of characteristic value and language indicator.
    Note 566044 also describes the problematic nature of nondisjunct data packages or duplicate or overlapping data records.
    3. Parallel loading of time-dependent attributes or texts
    When you load time-dependent attributes or texts, a data record that is to be loaded is assigned to a characteristic value and also specifically to a time interval within which the contained attribute values or texts are valid. For this reason, requests for loading time-dependent attributes or texts typically contain several data records for each characteristic value. If data records of a certain characteristic value or the combination of characteristic value and language indicator are distributed over several data packages, this may (as already explained) cause conflicts when the master data tables are accessed and may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199.
    4. Delta update of an ODS object in a characteristic
    If you use a delta update from an ODS object to load master data attributes or texts, the activation of the ODS object determines the contents of the data packages that are used for loading the attributes or texts. If all of the ODS loading requests that are used as the data basis for the update into the characteristic are activated together in one step, this guarantees that the system sets up the data packages strictly disjunct in relation to the key fields of the ODS object. If the key fields of the ODS object are displayed one for one on the compounded characteristic value in the update rules, the data may be updated to the characteristic in parallel and by package.
    However, if ODS loading requests that were activated at different times appear in the data basis for the update, data records of a certain ODS key or characteristic value may be distributed over several data packages. As already explained, this may cause conflicts when you access master data tables of the characteristic, and it may result in SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199.
    This problem is also described in Note 666213.
    5. Simultaneous loading of attributes and transaction data
    If, when loading attributes of a characteristic, you discover that there are still no entries in the master data tables (P, Q) for a specific characteristic value, the system prepares to insert corresponding data records into these database tables. You can load transaction data into an InfoCube or an ODS object, for example, with the option to create missing master data entries for characteristics that involved during the loading. If a request now creates entries for transaction data in the master data tables (P, Q) of the characteristic before the process has inserted corresponding data records into the master data tables (P, Q) for loading attributes for this characteristic, this may result in a SAPSQL_ARRAY_INSERT_DUPREC runtime error or error message RSDMD 199.
    This problem may also occur if you load attributes simultaneously for a characteristic that contains the other characteristic as a navigation attribute or compounding part. If a characteristic is compounded to other characteristics or if it contains characteristics other than the navigation attributes, the system also creates SID values for these dependent characteristics when the attributes are loaded depending on the settings in the InfoPackage.
    Solution
    To repair a possible inconsistency in the data, implement the RSRV test for master data.
    To avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199 when you load master data, carry out the following actions. The actions to be performed in each case depend on which of the above-mentioned reasons is responsible for the runtime error:
    1. Data inconsistencies in the master data tables of the characteristic
    If data inconsistencies are responsible for the runtime error, you can usually use the tests in transaction RSRV or the RSDMD_CHECKPRG_ALL program to solve this problem. Notes 323140, 566044, 592757 and 599269 describe these (among other things) in more detail. If you cannot implement an automatic correction in this way, you may have to use an ABAP program that is adjusted to the relevant situation to restore the data consistency.
    2. Parallel loading of nondisjunct data packages
    If the runtime error leads back to the parallel loading of nondisjunct data packages, you must identify the duplicate data records that cause the problem. As described in Note 566044, you can change the insertion into master data tables from an array insert to an individual insert statement. If you use a single-record insert statement to insert data records, duplicate data records are logged in the monitor by error messages (RSDMD 196). If the data records of a failed request are still available in the PSA, you can also search for duplicate data records directly in the relevant PSA table.
    You may also create duplicate data records if key values are changed in the transmission rules or update rules. If relevant rules are defined, you should check these to see if they can possibly create duplicate data records. If this is the case, you must correct the rules accordingly.
    3. Parallel loading of time-dependent attributes or texts
    The extractors used for loading time-dependent attributes or texts cannot guarantee that all data records assigned to a certain key value are contained together in only one data package. For this reason, to avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199, you should not load time-dependent attributes and texts in parallel. Data packages are processed serially if the 'PSA only' and 'Update Subsequently in Data Targets' settings under 'Processing' are used in the InfoPackage.
    4. Delta update of an ODS object in a characteristic
    To avoid SAPSQL_ARRAY_INSERT_DUPREC runtime errors or error message RSDMD 199 during the delta update from an ODS object to a characteristic, you must ensure that the data basis used for the update to the characteristic consists exclusively of ODS loading requests that were activated together in one step only. Note 666213 describes in detail how you can ensure this.
    5. Simultaneous loading of attributes and transaction data
    If the runtime error occurs because another load task creates SID values or master data for the same characteristic at the same time when attributes are loaded, you can only solve the problem by rescheduling. The load task in question must be scheduled in such a way that you can exclude simultaneous processing.
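    For item 1 above, a small hedged sketch: the consistency check program named in the note can also be launched from your own housekeeping code, assuming RSDMD_CHECKPRG_ALL is an executable report in your release (verify in SE38).

    " Hedged sketch: launch the master data consistency check program
    " mentioned in the note. Verify the report name in SE38 first.
    SUBMIT rsdmd_checkprg_all VIA SELECTION-SCREEN AND RETURN.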
    Regards
    CSM reddy

  • Open Hub and Informatica

    Hello All,
    We are facing a situation where we need to extract data from SAP BW and load it into Informatica.
    We have created an open hub.
    The data volumes in the targets are large. We are able to load the data into the open hub quickly (6 million records in 15 minutes), but when the data is transferred into Informatica, the time taken is huge (30-45 rows/sec). When we ran a trace, we couldn't find any reason why the data is extracting so slowly.
    Please let me know what checks I can do to find the bottleneck and what kind of performance improvements I should consider.
    Regards,
    Ravi

    Hi Kumar,
    Informatica has a connector for SAP BW, integrating as a 3rd party RFC destination via open hub.
    In the process chain that loads the open hub, you also need to include a custom ABAP program (distributed via the Informatica DVDs), which mainly does event logging and informs the listener service on the PowerCenter side when the load to the open hub destination table is finished.
    We will be using the same method, although we're just in blueprinting at the moment and haven't done a PoC yet. I've got a document; contact me for more info.
    Regards
    Steffen

  • Open Hub (SAP BW) to SAP HANA through DB connection: "Delete data from table" option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an open hub destination of type DB table using this DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the open hub service without checking the "Delete data from table" option.
    16 records were loaded from BW to HANA, the same on both sides.
    The second time I executed it from BW, 32 records arrived in HANA (it appends).
    I then executed the open hub service with the "Delete data from table" option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    Checking from SAP BW to SAP BW, it works fine.
    Does this option work through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the OH level (definition level, DESTINATION tab and FIELD DEFINITION):
    there is a check box there which I have already selected; that is exactly my issue, because
    even though it is selected, the deletion is not performed at the target level.
    SAP BW to SAP HANA via DB connection:
    1. First execution from BW: 16 records loaded up to HANA, 16 on both sides.
    2. Second execution from BW: the HANA side appended, i.e. 16+16 = 32.
    3. So I selected the "Delete data from table" check box at the OH level.
    4. Now executing the DTP throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the "Delete data from table" option applicable for HANA?
    Thanks
    Santhosh Kumar

  • Open hub for PSA to Database Table...

    Hi
    My requirement is to use open hub for loading data from a PSA (DataSource) to some BI table.
    The question is: can I make the DataSource the source of the transformation for the open hub?
    When I try to do so, I get the error "Cannot connect DataSource to Open Hub Destination".
    If we can't use a DataSource as the source of a transformation in an open hub, could you please suggest another way to fulfil my requirement? I don't want to create an InfoProvider from my DataSource as the source of my open hub.
    Thanks

    Hi Harpal,
    You can have a DataSource as the source of your transformation for an open hub destination.
    Just check whether your DataSource is active in the system.
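    If you want to script that check, a hedged lookup sketch follows; table RSDS and its key fields (DATASOURCE, LOGSYS, OBJVERS) are assumptions from memory about the 7.x DataSource directory, so confirm them in SE11.

    " Hedged sketch: check whether a 7.x DataSource has an active
    " version. DataSource and source system names are placeholders.
    DATA lv_count TYPE i.

    SELECT COUNT(*) FROM rsds INTO lv_count
      WHERE datasource = 'ZDS_TEST'
        AND logsys     = 'SRCCLNT100'
        AND objvers    = 'A'.
    IF lv_count = 0.
      WRITE: / 'DataSource is not active.'.
    ENDIF.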
    Regards,
    Durgesh.

  • Open Hub - Change from File to Database table

    Hello Experts;
    I want to change the open hub service that we created from "file" to "database table".
    We are using the delta update. However, after I change the open hub and activate the transformation and DTP, when I run it, I don't get any values in the table.
    If I run a new delta into my cube, the table only gets the data from that delta.
    Does anyone know how to load all the requests already in my cube before the first delta DTP run?
    Thanks and regards;
    Ricardo

    Create a new full DTP and execute it, so that you can dump all of the current contents of the InfoCube into the table and push/pull that table to the target prior to the next delta retraction.

  • Open Hub 3rd party tool - API not returning data table

    I am implementing a 3rd party tool to get data from BI using the Open Hub Service.
    I have my server running and the API functions are working somewhat, but I have two problems.
    When I get notified by BI (RSB_API_OHS_3RDPARTY_NOTIFY), the REQUESTID field is always 0. Why? This should be the request id of the Data Transfer Process; I can see this id in BI.
    The second problem is when I make the call to read the data (I hard-code the request ID): everything comes back fine, except the table is empty. I know it's not empty, because the DTP said it created a table with many rows and I can see them in BI. Why is the table empty?
    Any help is appreciated. Thanks.
    Tim
    PS: I'm using SAP JCo to implement the server and the API, and I'm using BI 7.0.

    I am not sure I should continue: I am not an expert in the use of these APIs; in fact, I have never used them :-)
    This is in case the discussion helps you:
    My understanding is that the 'green' status is the status of the 3rd party tool. Only when the 3rd party sets a 'green' status will extraction proceed; the 3rd party can choose to stop extraction by setting a 'red' status.
    See the link below:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/8dea54c4e273e2e10000000a1553f6/content.htm
    Guess all these APIs are geared for automatic processing - another link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/59/90070982d5524b931ae16d613ac04a/content.htm
    RSB_API_OHS_3RDPARTY_NOTIFY: this function module has no source code; the logic needs to be implemented in the 3rd party tool to do whatever is required with the parameters provided. The import parameters it requires have to be supplied to it.
    That's it, I am retiring:-)
    Mathew.

  • Open hub destination as ECC table

    Hi All,
    Is it possible to create a open hub destination to an ECC table?
    I can only see the options for CSV, table, and third party.
    Thanks,
    Nick.

    Alright Arun,
    Yeah, I was thinking of that. If we have hundreds of R/3 tables where all the data needs extracting, then cleansing, then importing into another R/3 system, would you consider doing this in BI, or would you go R/3 to R/3?
    Cheers,
    Nick.

  • Delete the initialization of delta on open hub table

    HI Gurus
    I have loaded data from a cube to an open hub table using a DTP with full update; later I loaded the data using a DTP with delta update. Now the delta has been initialized on the open hub table, and I want to delete the initialization of the delta on the open hub table. It would be great if someone could respond.
    Thanks
    Rohit

    Hi Sam,
    If I understand you correctly,
    You removed your initialization by going to InfoPackage --> Scheduler --> Initialization Options for Source --> and removing the request.
    Then you did an init without data transfer.
    Then you changed the InfoPackage option from init without data transfer to delta and loaded the data.
    Check in the process chain whether the InfoPackage you have mentioned in the selections may be pointing towards the init load instead of the delta.
    Sudhakar.
