Populate text description in the open hub destination

Hi SDNers,
I have a requirement with an open hub destination: I am sending cube data to an SQL Server database.
In that scenario I need the text descriptions along with the cube data, but they are not available in the cube (a cube holds dimension data and at most navigational attribute data, but no textual descriptions).
The text data lies outside the cube, but I need a couple of text descriptions from the master data texts.
Do I need to write code to populate (look up) the text descriptions from the text table?
Can anyone suggest an idea?
Thanks,
Satya

Yes, you can write a routine in the transformation from the cube to the open hub destination to read the text from the text table of that InfoObject.
Create a new InfoObject with a length equal to the text field of the source InfoObject, add this new InfoObject to the open hub destination, and in the transformation write a field routine like:
DATA: wa_text TYPE c LENGTH 20.   " TXTSH (short text) is CHAR 20

SELECT SINGLE txtsh FROM /bic/txxxxx INTO wa_text
  WHERE key_of_p_table = source_fields-/bic/source_field   " key field of the text table
    AND langu          = 'EN'.
IF sy-subrc = 0.
  result = wa_text.
ENDIF.
CLEAR wa_text.
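If the DTP moves a large number of records, a SELECT SINGLE per row can become slow. Below is a buffered variant; it is only a sketch, reusing the same placeholders (/BIC/Txxxxx, key_of_p_table, the source field) and assuming the texts are unique per key and language. The declarations go into the global part of the routine, the SELECT into the start routine, and the READ into the field routine.

" Global declarations of the transformation routine
TYPES: BEGIN OF ty_text,
         key   TYPE c LENGTH 30,   " key field of the text table
         txtsh TYPE c LENGTH 20,
       END OF ty_text.
DATA: gt_text TYPE SORTED TABLE OF ty_text WITH UNIQUE KEY key,
      gs_text TYPE ty_text.

" Start routine: fill the buffer once per DTP package
SELECT key_of_p_table txtsh FROM /bic/txxxxx
  INTO TABLE gt_text
  WHERE langu = 'EN'.

" Field routine: read from the buffer instead of the database
READ TABLE gt_text INTO gs_text
     WITH TABLE KEY key = source_fields-/bic/source_field.
IF sy-subrc = 0.
  result = gs_text-txtsh.
ENDIF.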

Similar Messages

  • Open hub destination to download the output file into R/3

    Hi Guys,
    I have a requirement where I need to download the data from my BW cube to R/3. For this I am using an open hub destination.
    While creating the open hub destination there is an option called RFC Destination (there I selected my R/3 system as the destination). When I execute my DTP it says the load is successful and shows the number of records transferred.
    But I am not able to find where this data is transferred to in the R/3 system when using the open hub.

    Hi Friends,
    I have one Query on WDJ. Here I have two requirements.
    I click on the "Export Excel" button and the data is downloaded into an Excel file. Now my requirement is that, on clicking the Save button, the file is saved into the path /exchange/CED of the R/3 system.
    CED is the R/3 system ID; under the system ID we can save this file.
    Here the EP server and the R/3 server are different systems.
    I have written the following code:
    InputStream text = null;
    int temp = 0;
    try {
        File file = new File(wdContext.currentContextElement().getResource().getResourceName().toString());
        // String file = "abc.xls";
        // String file = "anat" + ".xls";
        wdComponentAPI.getMessageManager().reportSuccess("File::" + file);
        FileOutputStream op = new FileOutputStream(file);
        // FileOutputStream op = new FileOutputStream("D://usr//sap//BPE//" + file);
        // FileOutputStream op = new FileOutputStream("D://KumarDev-BackUp//" + file);
        wdComponentAPI.getMessageManager().reportSuccess("op::" + op);
        if (wdContext.currentContextElement().getResource() != null) {
            text = wdContext.currentContextElement().getResource().read(false);
        }
        while ((temp = text.read()) != -1) {
            op.write(temp);
        }
        op.flush();
        op.close();
        // path = file.getAbsolutePath();
        path = "/exchange/CED/" + file;
        wdComponentAPI.getMessageManager().reportSuccess("path:" + path);
    } catch (Exception ex) {
        wdComponentAPI.getMessageManager().reportSuccess(ex.toString());
    }
    My requirement is that the file is saved under this path (path = "/exchange/CED/" + file;).
    Regards
    Vijay Kalluri

  • Open Hub destination in BI 7.0

    I am trying to create an open hub destination to extract data from a MultiProvider to BPC. When I get to the step where I need to assign the InfoProvider to the open hub destination, I cannot find the MultiProvider in the drop-down menu. Does this mean that the open hub cannot use a MultiProvider as the data source?

    Hi,
    InfoCubes, DataStore objects, or InfoObjects (attributes or texts) can be used as open hub data sources.
    Please refer below link:
    http://help.sap.com/saphelp_nw70ehp1/helpdata/en/ce/c2463c6796e61ce10000000a114084/frameset.htm
    Thanks,
    Rema

  • Error in Transporting Open Hub Destination Objects

    Hi Experts,
    When I transport the open hub destination from development to the quality server, the error "No rule type exists" occurs while activating the transformation.
    The objects are transported, but the transformation and the DTPs remain inactive.
    Please assist, as this is the first time open hubs are being transported in my system.
    Any help will be greatly appreciated.
    Thanks,
    Sumit

    Hi,
    The return code is 8.
    I checked, and here is the error log:
    Start of the after-import method RS_TRFN_AFTER_IMPORT for object type(s) TRFN (Activation Mode)
      Activation of Objects with Type Transformation
      Checking Objects with Type Transformation
      Checking Transformation 08FM6JIPU6MCLHM2LWBY02OT6SXOPDNY
        No rule exists
      Checking Transformation 09RS1PONVSGIHIJEHRGDCWPP4UD87TBY
        No rule exists
      Checking Transformation 0MJTLNXZAUI7SKRWPKEFUIOCMB8X5TWQ
        No rule exists
      Saving Objects with Type Transformation
    Thanks,
    Sumit

  • Erase data from open hub destination data service

    Hi,
    I'm getting an error when trying to delete data from a cube whose data is extracted using the open hub service and loaded into a table.
    The error given is the following:
    Request updated in 1794 already by 1795 target DTPR_4EEX9CGGXLTU2QIPWUAIEUDZK DTP request (ZRAPPEL)
    Message no. RSM077
    Diagnosis
    One or more DTP requests have already updated the request.
    Procedure
    Firstly, delete all update requests from your targets. Then you can delete the source request.
    ZRAPPEL is the open hub destination, but it is not clear to me where to delete the requests for this destination so that I can then delete the cube data.
    Thanks.

    Hi,
    Procedure
    Firstly, delete all update requests from your targets. Then you can delete the source request.
    Follow the same steps above, that is, delete the requests from the cube ZRAPPEL.
    Right-click on the cube -> Delete Data.
    Now the system will prompt you, asking:
    "Do you want to delete the content of data target ZRAPPEL?"
    Say yes to it.
    Hope this solves your problem.
    santosh

  • Program or Function module to delete data from Open Hub Destination Table

    Hi All,
    Can anybody suggest a program or function module to delete data from an open hub destination table?
    Thanks & Regards,
    Vinay Kumar

    You can simply go to transaction SE14, enter the open hub destination table, and delete the data by choosing "Activate and adjust database" with the radio button "Delete data" selected.
    Regards,
    Arminder
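    If a small program is preferred over SE14, a minimal sketch along these lines would also work. The table name /BIC/OHXYZ is only a placeholder for your open hub table, and the program removes the entire table content, so use it with care:

    REPORT zdel_ohd_table.

    PARAMETERS: p_tab TYPE tabname DEFAULT '/BIC/OHXYZ' OBLIGATORY.

    " Delete all rows of the given open hub destination table
    DELETE FROM (p_tab).
    IF sy-subrc = 0.
      WRITE: / 'Rows deleted:', sy-dbcnt.
    ELSE.
      WRITE: / 'Nothing deleted or table not found.'.
    ENDIF.
    COMMIT WORK.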

  • Open Hub Destination to Oracle database

    Hello All ,
    I am supposed to send data from my SAP BI (7.0) system to an Oracle database using an open hub destination. I have to use the option "Third-Party Tool" as the destination type while creating the open hub destination.
    From my reading, and according to Basis, I realised that I need a third-party ETL tool between BI and the Oracle system to write the data into Oracle, since Oracle is not SAP RFC aware. The third-party ETL tool I am supposed to use here is BusinessObjects Data Integrator (BO-DI). If anyone has implemented this type of open hub service, please throw some light on how to set up the third-party ETL tool in between and how to proceed. Any how-to document would also be useful.
    Thanks and Regards,
    Riddhi

    Hi,
    Please go through this link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/79f902dfb06fc9e10000000a1553f6/frameset.htm
    Hope it helps,
    Thanks & Regards,
    SD

  • Open hub Destinations - Third-Party Tools As Destinations

    Dear all, below are the process flow steps from SAP Help. I am quite clear on all of the steps except the APIs. Where do we define these APIs? When it says "You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction", where do I use it? Is it in the third-party tool, or is it in SAP BI?
    If it is in the third-party tool, where in the third-party tool? Do I need to write it as code, or would I have an option in the third-party tool?
    It would be helpful if someone can walk me through the Microsoft connector, using Microsoft SQL Server 2008 Integration Services packages with SAP BI.
    Where in a Microsoft SQL Server 2008 Integration Services package can I use these APIs? Any help please.
    If it is in SAP BI, where in SAP BI do I have to use these APIs? Any help is highly appreciated.
    Regards,
    KK
    Process Flow:
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool using API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.

    It's been a long time since I worked with open hub, but here goes:
    The third-party tool needs parameters or switches to work correctly; think of the old DOS command suffixes like "/a" or "/b".
    The APIs are simply a way to pass these parameters from the BW source to the third-party target.
    If your tool does not need them, don't worry about them.
    John Hawk
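    To make the calling side concrete: the APIs are RFC-enabled function modules in the BI system, and the third-party tool (or SSIS, via its SAP connector) calls them over the RFC destination. The ABAP test sketch below starts the process chain the way step 4 describes; the parameter names I_CHAIN and E_LOGID and the chain name are assumptions for illustration, so verify the interface of RSPC_API_CHAIN_START in SE37 before relying on it.

    DATA: lv_logid TYPE c LENGTH 25.        " assumed type of the returned log ID

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'ZOHD_EXTRACT_CHAIN'     " hypothetical process chain name
      IMPORTING
        e_logid = lv_logid.

    WRITE: / 'Process chain started, log ID:', lv_logid.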

  • Inconsistency with ZPMSENDSTATUS for Open Hub Destination

    Hi Experts,
    I don't know if you have come across this scenario, but I will appreciate all your advice.
    We have APO 7.0 and we are using the BW component to extract data to informatica.
    Our environment is as follow
    SAP SCM 7.0 SP8
    Informatica PWCD 8.1.6 hot fix 13
    1. We have created 5 chains with similar variants for ZPMSENDSTATUS.
    2. Each variant relates to a folder and a workflow to be executed within Informatica.
    3. When we execute the chains, only two work correctly all the time and three fail.
    The three that fail do so within ZPMSENDSTATUS when running API RSB_API_OHS_DEST_SETPARAMS. This is odd, since they differ from the working ones only in the workflow to be executed and, obviously, in the open hub destination. Can anyone provide some advice on this?
    thanks,
    Nat

    Hi Maruthi,
    thanks for your response, but here are the details for the variants that are working and failing:
    working:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP2_BGA:S_M_SAP_APO_PART_PLATFORM_ABP2
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    Failing:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP4_ATR:S_M_SAP_APO_PART_PLATFORM_ABP4
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    So you see, they all target the same folder, only with different workflows. Still, it works for the first and not for the second.

  • List of Open Hub Destination

    Hi All,
    Is there a table which lists all the open hub destinations?
    Saurabh.

    Hi Saurabh,
    The tables for open hub services are:
    RSBOHDEST: Directory of open hub destinations
    RSBOHFIELDS: Fields of the open hub destination
    Some function modules are also available; they might be of help to you:
    RSB_API_OHS_DEST_GETLIST
    RSB_API_OHS_DEST_GETDETAIL
    Roy
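    For a quick overview in a small program, the directory table named above can also be read directly. A minimal sketch, assuming the destination name field of RSBOHDEST is called OHDEST (check the table definition in SE11 if it differs):

    DATA: lt_dest TYPE STANDARD TABLE OF rsbohdest,
          ls_dest TYPE rsbohdest.

    " List all open hub destinations defined in the system
    SELECT * FROM rsbohdest INTO TABLE lt_dest.
    LOOP AT lt_dest INTO ls_dest.
      WRITE: / ls_dest-ohdest.
    ENDLOOP.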

  • Multiprovider as Datasource for Open Hub Destination

    Hello,
    Can you please let me know whether we can use a MultiProvider as the data source for the open hub destination? We are currently on BI 7.0, SP9. I am not able to see the MultiProvider option in the template.
    Appreciate the input ASAP.
    Thanks
    Gopal

    MultiProviders are not yet supported as sources for the open hub destination, and hence you don't see that option.
    As of SAP NetWeaver 2004s SPS 6, the open hub destination has its own maintenance interface and can be connected to the data transfer process as an independent object. As a result, all data transfer process services for the open hub destination can be used.  You can now select an open hub destination as a target in a data transfer process.  In this way, the data is transformed as are all other BI objects.
    In addition to the InfoCube, InfoObject, and DataStore object, you can also use DataSources and InfoSources as templates for the field definitions of the open hub destination.
    The open hub destination now has its own tree under Modeling in the Data Warehousing Workbench. This tree is structured by InfoAreas.
    Restrictions:
    It is not currently possible to use MultiProviders as the source for the open hub destination.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7c836c907c7103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Problem in transporting Open Hub destination

    Hi,
    I have an open hub destination whose destination is a database table. When I try to transport the open hub destination to staging, the transport fails with "Unable to activate table /BIC/OHXXX". What could be the reason?
    Thanks,
    Satya

    Hi Satya,
    I have exactly the same problem. The reason the table could not be activated is that the referenced fields of the open hub table are wrong. Unfortunately, I do not know at the moment why this happens. It would be great if you could post the solution once you have solved the problem. I will keep you informed...
    Thanks,
    Thomas

  • Open Hub Destination-Third Party Tool

    dear all,
    Can anyone guide me on how data (master data in particular) can be transferred from BW to an Oracle database using an open hub destination? Is an RFC connection between BW and the third-party tool enough? Do we have to maintain the same structure in the third-party tool before executing the DTP, and how can this be done?
    I am currently able to transfer data from BW into a flat file and also into a database table (in BW), but I am not aware of this third-party tool functionality. Any light thrown on this area is welcome, and points will be awarded.
    thanks ,
    sasidhar gunturu

    Some light at the end of the tunnel for sure...
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool via API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.
    For more details click below -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7a69d9f3897103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub Destination - csv files

    Hi all,
    I need to send information through an open hub destination from a cube to a file on a server. But the open hub destination is generating 2 CSV files instead of 1: one file contains only the names of the fields that will be in the columns, and the other file only the data in the columns.
    Is there an alternative to generate, through the open hub destination, only one file, with the headers in the columns?
    Can you explain to me how to do it?
    Are there other options to generate CSV files automatically from cubes and send them to other applications?
    Thanks a lot.
    Regards.
    Elisabete Duarte

    Hi,
    When you send information from a cube to the application server through OHS, two files are created: one is saved on the application server, and for the other you have to give the path. For downloading the file to the application server, select the application server option on the Destination tab and transfer it.
    Hope it helps
    Cheers
    Raj
    Message was edited by:
            Raj
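    If a single CSV file with a header row is really required, one possible workaround is a small post-processing program on the application server that builds a header line from the structure file and appends the data file. This is only a sketch under assumptions: the file paths are hypothetical, and the field name is assumed to be the first comma-separated column of the S_ file.

    REPORT zmerge_ohd_csv.

    DATA: lv_line   TYPE string,
          lv_name   TYPE string,
          lv_header TYPE string.

    " Hypothetical file names on the application server
    CONSTANTS: c_struct TYPE string VALUE '/tmp/S_OUTPUT.CSV',
               c_data   TYPE string VALUE '/tmp/OUTPUT.CSV',
               c_merged TYPE string VALUE '/tmp/OUTPUT_WITH_HEADER.CSV'.

    " Build one header line from the field names in the structure file
    OPEN DATASET c_struct FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET c_struct INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      SPLIT lv_line AT ',' INTO lv_name lv_line.
      IF lv_header IS INITIAL.
        lv_header = lv_name.
      ELSE.
        CONCATENATE lv_header lv_name INTO lv_header SEPARATED BY ','.
      ENDIF.
    ENDDO.
    CLOSE DATASET c_struct.

    " Write the header line, then append the data file unchanged
    OPEN DATASET c_merged FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    TRANSFER lv_header TO c_merged.
    OPEN DATASET c_data FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET c_data INTO lv_line.
      IF sy-subrc <> 0. EXIT. ENDIF.
      TRANSFER lv_line TO c_merged.
    ENDDO.
    CLOSE DATASET c_data.
    CLOSE DATASET c_merged.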

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destination (NOT an InfoSpoke).
    Output is to a Flat File. Remember that when output is to a flat file 2 files are generated: the output file and a structure file (S_<name of output file>.CSV)
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Anyone knows a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links, this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, the source structure).
    I would suggest you check it again.
    Derya
