Need code for open hub destination

Hi Experts,
I have a requirement on open hub destination, as we are working on FI DataSources.
While using an open hub destination, it creates two files: one is the structure (header) file and the other is the data file.
Now my client is asking us to combine both files into a single file while using the open hub destination.
Could you please explain how to achieve this?
Please share the code.
Thanks in advance!
Regards
sinu reddy
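
As far as I know there is no standard setting that merges the two files, so one possible workaround is a small ABAP report run after the DTP (for example as a process chain step) that rebuilds a single file on the application server: it derives a header row from the S_ structure file and then appends the data file. This is only a sketch: the file paths are hypothetical placeholders, and it assumes the S_ file carries one field description per line with the field name in the first semicolon-separated column.

report zohd_merge_files.

* hypothetical paths - replace with the directory and file names of your OHD
constants: gc_struct_file type string value '/tmp/S_ZOHD_FI.CSV',
           gc_data_file   type string value '/tmp/ZOHD_FI.CSV',
           gc_merged_file type string value '/tmp/ZOHD_FI_FULL.CSV'.

data: lv_line   type string,
      lv_field  type string,
      lv_header type string.

* build one header row from the S_ (structure) file
* (assumption: the field name is the first semicolon-separated column;
*  error handling for OPEN DATASET is omitted for brevity)
open dataset gc_struct_file for input in text mode encoding default.
do.
  read dataset gc_struct_file into lv_line.
  if sy-subrc <> 0.
    exit.
  endif.
  split lv_line at ';' into lv_field lv_line.
  if lv_header is initial.
    lv_header = lv_field.
  else.
    concatenate lv_header lv_field into lv_header separated by ';'.
  endif.
enddo.
close dataset gc_struct_file.

* write the header row, then append all data rows unchanged
open dataset gc_merged_file for output in text mode encoding default.
transfer lv_header to gc_merged_file.
open dataset gc_data_file for input in text mode encoding default.
do.
  read dataset gc_data_file into lv_line.
  if sy-subrc <> 0.
    exit.
  endif.
  transfer lv_line to gc_merged_file.
enddo.
close dataset gc_data_file.
close dataset gc_merged_file.

The semicolon separator is also an assumption; use whatever separator is configured in your Open Hub destination.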


Similar Messages

  • Multiprovider as Datasource for Open Hub Destination

    Hello,
Can you please let me know if we can use a MultiProvider as the datasource for the Open Hub destination. We are currently on BI 7.0 SP9. I am not able to see the option of MultiProvider in the template.
    Appreciate the input ASAP.
    Thanks
    Gopal

MultiProviders are not yet supported for the Open Hub destination, and hence you don't see that option.
    As of SAP NetWeaver 2004s SPS 6, the open hub destination has its own maintenance interface and can be connected to the data transfer process as an independent object. As a result, all data transfer process services for the open hub destination can be used.  You can now select an open hub destination as a target in a data transfer process.  In this way, the data is transformed as are all other BI objects.
    In addition to the InfoCube, InfoObject, and DataStore object, you can also use DataSources and InfoSources as templates for the field definitions of the open hub destination.
    The open hub destination now has its own tree under Modeling in the Data Warehousing Workbench. This tree is structured by InfoAreas.
Restrictions:
    It is not currently possible to use MultiProviders as the source for the open hub destination.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7c836c907c7103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan

  • Help required for Creation of process chain for Open hub destination(table)

    Hi Experts,
I am working on creating a process chain for the open hub destination, which targets data transfer into a table that is then pushed to SAP PI.
2.) What are the steps to create a process chain for an OHD? I don't know them, and even after that:
3.) What are the steps for pushing it to PI? (We have generated the client proxy, but I need to know the ABAP to trigger the process chain.)
Please lend your expertise and knowledge. At the very least, I need BI experts' help with point 2.
    Regards,
    [Gaurav Patwari|http://gauravpatwari.wordpress.com]

    Hi Gaurav,
    Creating Process chain for OHD.
1. Go to RSPC -> create a start variant.
2. Click on "Process Types" in the menu (fourth from the left). Under this, go to the LOAD PROCESSES AND POST PROCESSING option.
3. The fifth option is "DATA EXPORT INTO EXTERNAL SYSTEMS". Click on that.
4. Press F4 and choose your InfoSpoke name.
5. Press the RIGHT button (if it asks for a variant, create a new one in the same small window, next to the input field).
6. Join the START process and the second process by dragging.
7. Now add the other processes you require.
    The general process chain processes are :
    1. Start.
    2. Delete Indexes (if data loaded to cube).
    3. Load process.
    4. Delete duplicate request (if it is a full load in cube).
    5. Create Indexes.
    Hope it helps.
    Preet
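
For the ABAP trigger in point 3, here is a minimal sketch using the standard function module RSPC_API_CHAIN_START, which can be called from your inbound proxy implementation; the chain name ZPC_OHD_PUSH is a hypothetical placeholder for your own chain:

data: lv_logid type rspc_logid.

* start the process chain; 'ZPC_OHD_PUSH' is a hypothetical chain name
call function 'RSPC_API_CHAIN_START'
  exporting
    i_chain = 'ZPC_OHD_PUSH'
  importing
    e_logid = lv_logid
  exceptions
    failed  = 1
    others  = 2.
if sy-subrc <> 0.
  write: / 'Process chain could not be started.'.
else.
  write: / 'Chain started, log id:', lv_logid.
endif.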

  • DTP Delta request for open hub destination.

    Hi expert,
As you know, in BW 7.0 the open hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases DTP delta is enabled.
My questions are:
1) In the context menu of the open hub destination, I did not find anything like "Manage" that would list all the DTP requests executed for this open hub destination, but for a normal InfoCube or ODS you can find such a list.
2) If the DTP enables delta, where can I delete the latest delta request and run a new load?
3) I assume the DTP can be executed in batch, integrated into a process chain via the process type "DTP", right? If the destination type is a third-party tool, can it also be integrated into a process chain?
4) Can you give some obvious differences between an InfoSpoke and an open hub destination in terms of advantages and disadvantages?
    Thanks a lot for the feedback.
    Best Regards,
    Bin

Hi,
Please look at the links below. Hope it helps.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh

  • Inconsistency with ZPMSENDSTATUS for Open Hub Destination

    Hi Experts,
I don't know if you have come across this scenario, but I will appreciate all your advice.
We have APO 7.0 and we are using the BW component to extract data to Informatica.
Our environment is as follows:
    SAP SCM 7.0 SP8
    Informatica PWCD 8.1.6 hot fix 13
1. We have created 5 chains with similar variants for ZPMSENDSTATUS.
2. Each variant relates to a folder and workflow to be executed within Informatica.
3. When we execute the chains, only two work correctly all the time and three fail.
The three that fail do so within ZPMSENDSTATUS when running the API RSB_API_OHS_DEST_SETPARAMS. This is odd, since the other two differ only in the workflow to be executed and, obviously, in the Open Hub destination. Can anyone provide some advice on this?
    thanks,
    Nat

    Hi Maruthi,
Thanks for your response, but here are the details for the variants that are working and failing.
    working:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP2_BGA:S_M_SAP_APO_PART_PLATFORM_ABP2
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    Failing:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP4_ATR:S_M_SAP_APO_PART_PLATFORM_ABP4
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
So you see, they all target the same folder, only with different workflows. Still, it works for the first and not for the second.

  • Need transaction code for opening Bex query designer?

    Hi ALL,
Can anyone help me with the transaction code for opening the BEx Query Designer?
Like RRMX for the BEx Analyzer, I want the equivalent for the BEx Query Designer.
    Thanks & Regards
    Sameer Khan

There is no transaction code for the Query Designer, but there is an alternative:
Open the BEx Analyzer through RRMX.
Open a query.
Click on Tools -> Edit Query.
This opens the Query Designer, and you can use it normally.

  • Issue for DTP from DSO to open hub destination

    Hello Gurus,
I have an issue with a DTP from a DSO to an open hub destination. The long text for the error in the monitor is as follows:
"Could not open file \\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2 on application server"
"Error while updating to target ZFIGLH03 (type Open Hub Destination)"
For the open hub destination, I checked the configuration for the logical file name, which is "\\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr".
I am wondering where the file "\\SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2" in the error message comes from?
    Many thanks,

    Hi
You do not need to create the file on the application server; it will be created automatically.
But if you have defined a logical file name in transaction FILE and used that in the OHD, and it is not correct, then it will show a conflict. Check this out.
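
To verify which physical path a logical file name actually resolves to, here is a minimal sketch using the standard function module FILE_GET_NAME; the logical file name Z_OHD_GENLGR is a hypothetical placeholder for the one maintained in transaction FILE:

data: lv_file type filename-fileextern.

* 'Z_OHD_GENLGR' is a hypothetical logical file name maintained in FILE
call function 'FILE_GET_NAME'
  exporting
    logical_filename = 'Z_OHD_GENLGR'
  importing
    file_name        = lv_file
  exceptions
    file_not_found   = 1
    others           = 2.
if sy-subrc = 0.
  write: / 'Resolved physical file:', lv_file.
else.
  write: / 'Logical file name not found - check transaction FILE.'.
endif.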

  • Open hub destination of text and attribute for Functional Area

    Hi Expert
I have a requirement to accommodate both the attribute (0FUNCT_LOC_ATTR) and the text (0FUNCT_LOC_TEXT) for functional area in an open hub destination. For this, a DSO has been created from both the 0FUNCT_LOC_ATTR and 0FUNCT_LOC_TEXT data sources. The problem is that the DSO has only 0FUNCT_AREA (functional area) as its key field, so the texts for both English (EN) and Dutch (NL) for a particular 0FUNCT_AREA do not get loaded. If we assign both 0LANGU and 0FUNCT_AREA as a composite key of the DSO, data from the attribute transformation is loaded as blank, as there is no language field in the attribute data source (0FUNCT_LOC_ATTR). Please let me know how I can design this to accommodate both the texts in different languages and the functional area in the open hub destination.
I know that I can create separate open hub destinations for the master data text and attribute.
    Any help is highly appreciated.
    Regards
    Saikat

    HI Saikat,
You can create an additional characteristic, a counter, and make it a key field in the DSO. After that, we can populate this characteristic in the end routine. Its value would increase by 1 every time a different text comes for the same functional area.
    Sample code
data: lv_count type n length 4,
      lv_farea type /bi0/oifunct_area.
field-symbols: <result_fields> type _ty_s_tg_1. "standard end-routine result structure
sort result_package by funct_area.
clear lv_count.
* Increase the counter for each further row of the same functional area;
* restart at 1 when a new functional area begins.
loop at result_package assigning <result_fields>.
  if lv_farea is initial or lv_farea = <result_fields>-funct_area.
    lv_count = lv_count + 1.
  else.
    lv_count = 1.
  endif.
  lv_farea = <result_fields>-funct_area.
  <result_fields>-zcount = lv_count.
endloop.
clear lv_farea.
This way the value of the counter gets increased by one every time a new text comes for the same functional area, so we can have any number of texts per functional area with this method.
The only issue would be when a delta update comes for an existing text. If you are doing a full load to the DSO from the datasource, there is no issue with this method.

  • Multiple data sources to open hub destination ??

    Hi,
Is it possible to assign multiple data sources to an open hub destination?
At present we are FTPing a file to users. The users need one more field added to this file. The information for this field belongs to another ODS. I checked the InfoSpoke, but it does not accept multiple data sources.
    Any ideas?
    Thanks!
    Edited by: BI Quest on Sep 4, 2008 8:12 PM

You can create multiple transformations for an open hub destination. Set the destination of the open hub destination to a file on the application server, and then use a transfer mechanism to move the files to the desired destination.
    Regards,
    Pravin Karkhanis.

  • Open hub destination with datasource as the source

    Hi,
I am trying to use the new BI feature for the open hub destination where you can use a datasource as the source for extraction.
I have created an InfoPackage, loaded data to the PSA, and from there to the open hub destination via the DTP.
When I try to execute the DTP, I get the following error:
"Datapackage processing terminated.
Error in substep.
You are not authorized to use this transformation."
I have been able to use the DTP to load open hub destination targets with DataStore objects and InfoCubes as the source.
Also, when we turn on the security trace, it doesn't fail anywhere.
    Has anybody faced a similar problem?
    Any help appreciated,
    Thanks,
    Payal.

I have created two OHDs - one for a 3.5 data source and one for a 7.0 data source (two different data sources). While creating the DTP for the 7.0 data source, it did create a default mapping for the transformation.
    But I still get same error in both cases.
When I create the DTP, it tries to create a transformation, but I get an information message:
Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
I can still activate the DTP, and when I run it, it says I don't have authorization for this transformation.
I think something goes wrong while creating the transformation, but I don't know what!
    -Payal.

  • Open Hub Destination transports

    Hi Experts..
We have created Open Hub destinations to pull data from an ODS into a DB table in BI 7.0. But during transport, the transformations for the OHD are failing, with the transformation becoming inactive and the error message "Target ODSO ZODS*** does not exist". The ODS, however, is present in the QA system and is in active version.
So can anyone please help me with the proper sequence for Open Hub Destination transports and the necessary objects to be captured in the request? Currently I had captured the OHD, the transformation, and the DTP in the same request.
    <removed by moderator>
    Regards,
    Umesh.
    Edited by: Siegfried Szameitat on Jan 16, 2009 11:57 AM

Send the OHD in one request, the transformations in another request, and then the DTP in a third request.
When you club the transformations and DTP together, it fails, in my experience.

  • Open Hub Destination to Informatica

    HI All,
I am having the following issue. We have created an Open Hub Destination on an InfoCube for a third-party tool (Informatica), but the target table is not visible in Informatica. They were able to view and extract the target when it was created using InfoSpokes, but when we use the Open Hub Destination they are not able to view it. We verified the RFC connection. We also triggered the DTP for the Open Hub Destination, and it is sending data to the target table structure.
    Please help me in this regard
    Regards
    YR

Did you use Informatica's SAP Connector? It is an add-on, and the license has to be purchased from Informatica. It helps to load delta and continuous daily loads from SAP BI to the target system with Informatica as the ETL tool. An InfoSpoke creates a .csv file in a particular folder, which is why Informatica is able to see it.
    Thanks,
    S

  • Open Hub Destination with Flat File - Structure file is not correct

We are using the new concept for the Open Hub destination (NOT an InfoSpoke).
Output is to a flat file. Remember that when output is to a flat file, two files are generated: the output file and a structure file (S_<name of output file>.CSV).
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Anyone knows a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links, this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
As far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, on the transformation tab, the source structure).
I would suggest you check it again.
    Derya

  • Open hub destination - additional property needed

    Hi,
I have one open hub destination object which writes data from an InfoCube to an external file (on the application server), and the file is distributed to customers via email.
This is working absolutely fine.
Now I also need to extract data from another datasource into the same cube (the data is nothing but corrections to the original data existing in the cube), and the corrected data needs to be updated in the file on the application server.
As overwriting is not possible in InfoCubes, I wonder how I could get the corrected data updated in the external file.

    Hi,
Try to load a DELTA to the cube from the datasources. This way only the changes will be added to the already existing records.
Otherwise, load the data from the cube to a new DSO, and then load the data from the other datasources to the same DSO. The latest correct data from the other datasources will overwrite the existing data in the DSO.
Use this new DSO in the open hub.
    Regards,
    Balaji V

  • Modifying the destination directory on the application server for Open Hub

    Hi All,
I want to load a CSV file onto the application server using Open Hub.
When I created the open hub, I specified the server name, the file name, and the directory.
But the Basis guys told me that I should use another directory. They have created a Unix directory for me, but the problem is that I can't modify the directory path in my open hub.
Can you please tell me what the problem is?
Another question: why can't I modify this server name/directory parameter for the open hub directly in the production system?
In production I can create an open hub, but when I have transported one from dev to prod, I can't modify it in prod. I have checked the object's modifiability in the transport tool, and I see that for open hub it is "original modifiable".
    Thanks for your help
    Bilal

I have read in some forums that to change the server name or logical file name we can use the following table:
RSBFILE.
I tried to view the data in this table and found 2 records for my open hub: one for the active version and one for the modified version.
I tried to change the directory path name here, but after saving, the system gave me the message that the active version and the modified version of my open hub are not equal, and when I try to activate my open hub again, the system takes the old directory path.
Is this the right table to change the directory path on the application server for my open hub, or is there another table?
I don't work with logical file names; I work with the file name and directory path name.
    thanks for your help
    Bilal
