Open Hub extraction for Hierarchies

Hello,
I need to create an Open Hub destination on 0MAST_CCTR in HR-PA. Unfortunately, Open Hub does not give you an option for hierarchies. Any ideas?
Thanks,

Hi,
  If it is a flat file output, you can use the program described in the SAP How-To document below. We are using this in our system.
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?quicklink=index&overridelayout=true
Regards,
Raghavendra.

Similar Messages

  • To create a header for an Open Hub extract

    Hi guys,
    How do I get a header for an open hub extract to a CSV file, i.e. a header line consisting of the field names (technical or otherwise)?
    Thanks,
    Your help will be greatly appreciated

    hi,
    Field definition
    On the Field Definition tab page you define the properties of the fields that you want to transfer.
    We recommend that you use a template as a basis when you create the open hub destination. The template should be the object from which you want to update the data. This ensures that all the fields of the template are available as fields for the open hub destination. You can edit the field list by removing or adding fields. You can also change the properties of these fields.
    You have the following options for adding new fields:
    ●     You enter field names and field properties, independent of a template.
    ●     You select an InfoObject from the Template InfoObject column. The properties of the InfoObject are transferred into the rows.
    ●     You choose  Select Template Fields. A list is displayed of fields that are available for the open hub destination but are not contained in the current field list. You transfer a field to the field list by double-clicking it. This also allows you to transfer fields that had been deleted back into the field list.
    If you want to define the properties of a field so that they are different from the properties of the template InfoObject, delete the template InfoObject entries for the corresponding field and change the properties of the field. If there is a reference to a template InfoObject, the field properties are always transferred from this InfoObject.
    The file or database table that is generated from the open hub destination is made up of the fields and their properties and not the template InfoObjects of the fields.
    If the template for the open hub destination is a DataSource, field SOURSYSTEM is automatically added to the field list with reference to InfoObject 0SOURSYSTEM. This field is required if data from heterogeneous source systems is being written to the same database table. The data transfer process inserts the source system ID that is relevant for the connected DataSource. You can delete this field if it is not needed.
    If you have selected Database Table as the destination and Semantic Key as the property, the field list gets an additional column in which you can define the key fields for the semantic key.
    In the Format column, you can specify whether you want to transfer the data in the internal or external format. For example, if you choose External Format here, leading zeros will be removed from a field that has an ALPHA conversion routine when the data is written to the file or database table.
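    As a small illustration of the External Format option described above, this is roughly what happens to a field with an ALPHA conversion routine when it is written out (a hedged sketch; the sample value is made up):

    ```abap
    DATA: lv_internal TYPE char18 VALUE '000000000000123456',
          lv_external TYPE char18.

    " CONVERSION_EXIT_ALPHA_OUTPUT strips the leading zeros that the
    " internal format carries; this is the conversion applied when you
    " choose External Format for a field with an ALPHA conversion routine.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
      EXPORTING
        input  = lv_internal
      IMPORTING
        output = lv_external.

    WRITE: / lv_external.   " '123456' instead of '000000000000123456'
    ```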
    Assign points if helpful
    Cheers!!
    Aparna

  • Problem with an Open Hub Extraction

    Hi Experts,
    I have a huge problem here.
    I'm trying to extract some data with an Open Hub application, but I'm having trouble with the application server.
    See, I have to extract the information from a DSO to a file on an application server, but the company has one application server for each environment (BD1, BQ1, etc.); the conversion is not working and the transport of the request always fails.
    Anyone knows how to fix this??
    Thanks.
    Regards.
    Eduardo C. M. Gonçalves

    Hi,
    While creating the open hub, you need to maintain the file name details under transaction FILE.
    I hope you have maintained that.
    There you will also have to use a variable in the file path, which will change the file path in each system.
    You will have to define:
    Logical File Path Definition
    Assignment of Physical Paths to Logical Path
    Logical File Name Definition, Cross-Client
    Once you have defined this and transported it, your Open Hub will run smoothly.
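    To see how the logical file name maintained in transaction FILE resolves to a system-specific physical path at runtime, here is a minimal sketch (the logical name ZBW_OHD_EXPORT is a hypothetical example):

    ```abap
    DATA: lv_filename TYPE string.

    " FILE_GET_NAME resolves the logical file name against the physical
    " path maintained in transaction FILE for the current system, which
    " is why the same open hub can write to different paths in BD1, BQ1, etc.
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZBW_OHD_EXPORT'   " hypothetical logical name
      IMPORTING
        file_name        = lv_filename
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.

    IF sy-subrc = 0.
      WRITE: / lv_filename.   " system-specific physical path
    ENDIF.
    ```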
    Thanks
    Mayank

  • Open Hub Extraction

    Hello
    I am doing an open hub and sending data to a file. Now in the file I want to maintain a header line (the descriptions of the fields) for the fields I am sending to the file. How can I do that? Can someone please explain briefly?
    thanks

    Hi,
    Once you run the InfoSpoke, it generates two files: a data file and a schema file. The data file contains the data, and the schema file contains the header, i.e. the structure of the InfoSpoke. So if you want to maintain a header in the data file, just copy the list of InfoObject names from the schema file and paste it at the top of the data file.
    Thanks & Regards
    Ramakrishna Kamurthy
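    If you prefer to write the header line programmatically on the application server instead of pasting it by hand, a small sketch along these lines could work (the file path and field names are made up for illustration):

    ```abap
    DATA: lv_file TYPE string VALUE '/tmp/ohd_with_header.csv',
          lv_line TYPE string.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    " Write the header line first (field names copied from the schema file).
    TRANSFER 'MATERIAL;PLANT;QUANTITY' TO lv_file.
    " ...then append the data lines, e.g. read from the open hub data file.
    lv_line = '4711;1000;25'.
    TRANSFER lv_line TO lv_file.
    CLOSE DATASET lv_file.
    ```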

  • Open Hub Extract in .txt file

    Hi Expert,
    An abc.txt file is populated as the target of an open hub, through a process chain. The DTP monitor screen shows that the output data volume is around 2.8 million records every day, but I am not able to find more than 1,250,000 records in the file. Whatever the record count shown in the DTP monitor, the txt file always contains 1,250,000. Is there any limitation on the number of rows in a .txt file?
    Please help.
    Regards,
    Snehasish

    thanks for the reply Geetanjali.
    Yes, one field routine is there, but from the DTP monitor screen I can see that the total record count is 2,608,000, while the .txt file shows me 1,250,000, irrespective of the DTP monitor record count. Moreover, I am only populating 3 columns in my target txt file.

  • Open hub services for BAdI implementations

    I want to implement open hub services with a BAdI destination.
    Could you please send me a screenshot-based implementation guide? I am new to this task.
    thanks for help

    Hi Madhavi,
                    Can you please send those documents to me as well?
    Mail-Id : [email protected]
    Thanks in advance
    Regards
    Ramakanth.

  • Open Hub - Routine for KeyFigures possible?

    I created an OpenHub destination to export data from a cube. When I now want to modify the exported value for a key figure in a routine for this transformation, I do not see the default template "RESULT = ." and keep getting syntax errors for ABAP code that works fine in other transformation routines of the same cube.
    When using a formula instead, I also get error messages saying "Target parameter is not being used" and "Cannot compile formula" even though the formula is very simple.
    I am now wondering whether I am doing something wrong or whether it is not possible to change values in transformations for OpenHub exports.
    Thanks for clarifying,
    Dennis

    Hi Sasidhar,
    Please find the below links,
    INFOSPOKES:
    http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/c5/03853c01c89d7ce10000000a11405a/frameset.htm
    http://help.sap.com/saphelp_nw04/helpdata/en/59/90070982d5524b931ae16d613ac04a/frameset.htm
    http://help.sap.com/saphelp_nw2004s/helpdata/en/ce/c2463c6796e61ce10000000a114084/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/43f92595-0501-0010-5eb5-bb772d41ffa4
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5f12a03d-0401-0010-d9a7-a55552cbe9da
    http://help.sap.com/saphelp_nw04/helpdata/en/66/76473c3502e640e10000000a114084/frameset.htm
    Hope this helps,
    Sai.

  • Open Hub Third Party extraction in yellow state forever

    Hi,
    Would appreciate any assistance in this regard.
    Here's the situation:
    1 - We have third party open hub extraction setup.
    2 - All settings are fine : RFC destination, transformations, Open hub.
    3 - Now the issue is that we have a third party system which is data stage and teradata.
    4 - Now they trigger a job which in turn starts the PC in BW.
    5  - This PC has 2 steps only. The start and the DTP.
    6 - This DTP loads data from Info-object attributes to Open hub (basically a table).
    7 - The DTP technical status is green but overall status stays yellow.
    8 - Now I know that for 3rd party destinations we have several APIs, and here's my understanding:
    When the extraction is finished and the technical status is green, BW notifies the 3rd party through RSB_API_OHS_3RDPARTY_NOTIFY.
    Then the 3rd party reads the data through RSB_API_OHS_DEST_READ_DATA.
    Then, if the data is fine, the 3rd party sends a confirmation through RSB_API_OHS_REQUEST_SETSTATUS and the overall status turns green.
    Now I do not understand what's wrong, because all of the above should complete automatically.
    We have other open hubs with the same RFC destination and they complete successfully. Just for information: we have recently moved to SP 10 on EHP1.
    Regards
    Debanshu

    Were you able to solve this? I am also facing the exact same problem.
    Regards
    Pankaj

  • Issue for DTP from DSO to open hub destination

    Hello Gurus,
    I have an issue with a DTP from a DSO to an open hub destination. The long text for the error in the monitor is as follows:
    "Could not open file SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2 on application server"
    "Error while updating to target ZFIGLH03 (type Open Hub Destination)"
    For the open hub destination, I checked the configuration of the logical file name, which is "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr".
    I am wondering where the file "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2" in the error message comes from.
    Many thanks,

    Hi
    You do not need to create the file on the application server; it will be created automatically.
    But if you have defined a logical file name in transaction FILE and used that in the OHD, and it is not correct, then it will show a conflict. Check this out.

  • Modifying destination Directory of Application server for Open Hub

    Hi All,
    I want to load a CSV file onto the application server using Open Hub.
    When I created the open hub, I specified the server name, the file name and the directory.
    But the Basis guys told me that I should use another directory; they have created a Unix directory for me, but the problem is that I can't modify the directory path in my open hub.
    Can you please tell me what the problem is?
    Another question: why can't I modify this server name / directory parameter for the open hub directly in the production system?
    In production I can create an open hub, but when I have transported one from dev to prod I can't modify it in prod. I have checked the object changeability in the transport tool, and I see that for open hub it is "original modifiable".
    Thanks for your help
    Bilal

    I have read in some forum that to change the server name or logical file name we can do so in the table RSBFILE.
    I tried to view the data of this table and found 2 records for my open hub: 1 for the Active version and 1 for the Modified version.
    I tried to change the directory path there, but after saving, the system gives me the message that the active version and the modified version of my open hub are not equal; when I try to activate my open hub again, the system takes the old directory path.
    Is this the right table to change the directory path on the application server for my open hub, or is there another table?
    I don't work with a logical file name; I work with a file name and directory path name.
    thanks for your help
    Bilal

  • Open hub destination of text and attribute for Functional Area

    Hi Expert
    I have a requirement to accommodate both the attribute (0FUNCT_LOC_ATTR) and the text (0FUNCT_LOC_TEXT) for functional area in an open hub destination. For this, a DSO has been created from both the 0FUNCT_LOC_ATTR and 0FUNCT_LOC_TEXT data sources. The problem is that the DSO has 0FUNCT_AREA (Functional Area) as its only key field, so the texts for both English (EN) and Dutch (NL) for a particular 0FUNCT_AREA are not getting loaded. If we assign both 0LANGU and 0FUNCT_AREA as a composite key of the DSO, the data from the attribute transformation is loaded as blank, as there is no language field in the attribute data source (0FUNCT_LOC_ATTR). Please let me know how I can design this to accommodate both the texts in different languages and the functional area for the open hub destination.
    I know that I can create separate open hub destination for master data TEXT and Attribute.
    Any help is highly appreciated.
    Regards
    Saikat

    Hi Saikat,
    You can create an additional counter characteristic and make it a key field in the DSO. You can then populate this characteristic in the end routine; its value increases by 1 every time a different text arrives for the same functional area.
    Sample code (cleaned up; the generated type name /bic/oifunc_area and the field names func_area/zcount are illustrative and depend on your InfoObjects):
    data: lv_count type n,
          lv_farea type /bic/oifunc_area.
    " Sort so that rows for the same functional area are adjacent.
    sort result_package by func_area.
    lv_count = 0.
    " Check whether we are still on the same functional area
    " or have moved on to a different one.
    loop at result_package assigning <result_fields>.
      if lv_farea is initial or lv_farea = <result_fields>-func_area.
        lv_count = lv_count + 1.
      else.
        lv_count = 1.
      endif.
      lv_farea = <result_fields>-func_area.
      <result_fields>-zcount = lv_count.
    endloop.
    clear lv_farea.
    This way the value of the counter increases by one every time a new text comes for the same functional area, so we can have any number of texts per functional area with this method.
    The only issue would be when a delta update comes for an existing text. If you are doing a full load to the DSO from the data source, there is no issue with this method.

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that saves the result as a file on the application server.
    The issue is: in the production environment we have two application servers; XYZ is the database server, and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save to XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are on SAP BW 7.31 support package 6.
    Any idea what the issue could be or where to look?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start a process chain on, the DTP job always runs on the A01 server, and this causes an error since the directory doesn't exist on server XYZ.
    This occurs because the DTP setting for the background job was left blank. I followed these steps to solve the problem:
    - open the DTP
    - go to "Settings for Batch Manager"
    - under "Server/Host/Group on Which Additional Processes Should Run", pick the desired server
    - save
    After that, no matter from where I start the open hub extraction, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Open Hub Destination question

    Hi,
    We want to be able to extract data from a cube and would like to use Open Hub Destination for it. We also want to extract deltas.
    The question is ..
    Should we be making a single info package for this with delta capability ..
    or 2 separate info packages .. 1 for full mode and another for delta mode.
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to extract the data will be the source.
    First create the open hub destination in transaction RSBO, and create the transformation between the open hub destination and the table to which you want to export.
    Then create the DTP between the cube and the table.
    You can directly use a delta DTP: the first run brings all the records, and later only the delta records will be updated.
    Thanks & Regards,
    Sathish

  • Open Hub 3rd Party - Process Chain Automation

    Hi,
    I have a requirement to fulfill the Open Hub extraction to 3rd party SQL Server DB using process chain automation.
    At times there are no deltas, and during the execution of the DTP, since there are no deltas, the request status shows GREEN but the overall status stays YELLOW because there are no new records.
    Until we set this request to GREEN, we can't trigger the next load. I am doing these steps manually by going to tables / function modules to change the status.
    I am looking for options to automate this in the process chain. Can someone help me with this request? Appreciate your help!
    Thanks,
    Pandu.

    Do you know when the delta will bring zero records? Also, when you say the overall status is yellow, which status are you talking about?
    You can always use an ABAP program in your process chain which checks whether the latest request ID has records and turns the status green, so that the dependent data triggers are sent.
    Thanks
    Abhishek Shanbhogue
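    For reference, such an ABAP step could be sketched like this. This is only a rough outline: the request table RSBKREQUEST and the function module RSBM_GUI_CHANGE_USTATE are the ones commonly cited for this purpose, but their availability and exact interfaces are assumptions here and should be verified in your release (SE11/SE37) before use:

    ```abap
    REPORT zset_ohd_request_green.

    " Hypothetical sketch: pick the most recent DTP request and force
    " its overall (user) status to green so the process chain can
    " continue when a delta legitimately brings zero records.
    DATA: lv_requid TYPE rsbkrequid.

    " Assumption: RSBKREQUEST holds one row per DTP request.
    SELECT MAX( requid ) FROM rsbkrequest INTO lv_requid.

    IF lv_requid IS NOT INITIAL.
      " Assumption: this FM changes the overall status of the request;
      " check its parameter names in SE37 before relying on it.
      CALL FUNCTION 'RSBM_GUI_CHANGE_USTATE'
        EXPORTING
          i_requid = lv_requid.
    ENDIF.
    ```

    In a real process chain you would also filter RSBKREQUEST to the specific DTP and add a check that the technical status is already green before changing the overall status.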

  • Open Hub destinations

    I am using open hub destinations to transfer cube data into database tables in BI. During activation of the open hub, database tables with names /BIC/OH* are created. I am facing a problem with the reference fields associated with this table: it is taking the field name itself as the reference field for quantity and currency fields. Suppose I have a field CRM_NETVAM in the table; the reference field shown for it is CRM_NETVAM itself. This prevents activation of the database table and ultimately causes the open hub functionality to fail.
    I need quick help for this and will give full points to anything which can help me out in solving this.

    Hello,
    The open hub service is used to distribute data from a BI system to non-SAP systems. In the open hub service, the central object for data export is the InfoSpoke. Within the InfoSpoke, BI objects act as the open hub data source, from which data is extracted in full or delta mode; flat files and database tables act as the open hub destinations, i.e. the target systems into which the data is transferred. Data can also be transferred using a Business Add-In (BADI).
    Please check this link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/c5/03853c01c89d7ce10000000a11405a/frameset.htm
    Hope it helps.
    Regards,
    Mona
