Open hub destination issue

Hi,
In our project we have a client instance (e.g. BWPD) and an application server (e.g. BWPRD) defined for load balancing.
We have created an open hub destination and assigned BWPD as the destination server.
When I execute the DTP manually on BWPD, it runs successfully.
However, the same DTP fails when placed in a process chain, with the error messages:
No Such File or Directory
Could not open file D:\usr\sap\BWPD\A01\work\Material on application server
Error while updating to target ZXXXX.
Options Tried:
Scheduled the process chain on background server BWPD (the same server specified in the open hub destination); the DTP still failed.
Tried with the application server option; it failed.
Tried with HOST as the option; it failed.
I can't make out what is going wrong. Any thoughts?
Regards.

Hi there,
I found this document quite useful.
Maybe it can shed some light on your issue.
[Creating Open Hub Destination using a Logical file to extract the data|Creating Open Hub Destination using a Logical file to extract the data]
Also, what OS do you have?
Is the syntax group created accordingly?
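Since the servers are load-balanced, it may also be worth confirming which application server the process chain step actually runs on, because the directory D:\usr\sap\BWPD\A01\work only exists on one host. A minimal sketch (assuming you can run a small ABAP report, e.g. via SE38, on the server group the chain uses; ZCHECK_APP_SERVER is just a placeholder name):

" Sketch: report the application server this session is running on.
" ZCHECK_APP_SERVER is only a placeholder report name.
REPORT zcheck_app_server.

START-OF-SELECTION.
  WRITE: / 'System ID:', sy-sysid,
         / 'Application server host:', sy-host.

If the host shown when running via the chain's server group differs from the server that owns the directory in the error message, that would explain why the manual run works and the chain run does not; a logical file name with the proper syntax group (or a shared directory) avoids the dependency on a single host.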

Similar Messages

  • Issue for DTP from DSO to open hub destination

    Hello Gurus,
    I have an issue with a DTP from a DSO to an open hub destination; the long text of the error in the monitor is as follows:
    "Could not open file SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2 on application server"
    "Error while updating to target ZFIGLH03 (type Open Hub Destination)"
    For the open hub destination, I checked the configured logical file name, which is "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr".
    I am wondering where the file "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2" in the error message comes from?
    Many thanks,

    Hi
    You do not need to create the file on the application server; it will be created automatically.
    But if you have defined a logical file name in transaction FILE and used it in the OHD, and that definition is not correct, you will get a conflict. Check what the logical name resolves to, for example as in the sketch below.
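    A minimal way to see what the logical file name actually resolves to (a sketch; ZOHD_GENLGR_FILE is only a placeholder for the logical file name maintained in transaction FILE and entered in the OHD):
    " Sketch: resolve a logical file name (tcode FILE) to its physical path,
    " so it can be compared with the path shown in the DTP error message.
    " ZOHD_GENLGR_FILE is a placeholder for your own logical file name.
    DATA lv_physical TYPE filename-fileextern.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZOHD_GENLGR_FILE'
      IMPORTING
        file_name        = lv_physical
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'Resolved physical file:', lv_physical.
    ELSE.
      WRITE: / 'Logical file name could not be resolved, sy-subrc =', sy-subrc.
    ENDIF.
    If the resolved path does not match the directory in the error message, the definition in FILE (or its syntax group assignment) is the place to correct it.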

  • Open Hub Destination with Application Server Issue

    Hi all,
    I have a requirement to save my monthly data on the application server with the help of an Open Hub Destination. I have created a DSO,
    and on that DSO I have created an Open Hub with the application server as the target. When I store the monthly file on the application server, the file name should follow the system date.
    So I have created a logical file----ztest_data
    Assigned a physical path to the logical file-----/usr/sap/<sysid>........
    Created a logical file name------Ztest_data_file....
    In the Open Hub I have selected Application Server as my target and I have checked the box.
    I have selected Logical File Name and given the file name which I created-----ztest_data_file...
    I ran the DTP and it shows no errors, but when I check in AL11, in the path I specified (/usr/sap/<sysid>........), I couldn't find the file.....
    Please check this and let me know...
    Thanks In Advance,
    Bobby.

    Hi,
    Please check the link /people/jyothi.velpula/blog/2010/01/20/creating-open-hub-destination-using-a-logical-file-to-extract-the-data.
    Also, is your DTP full or delta? Try running the DTP as full, and check what the logical file name resolves to on the server where the DTP ran, for example with the sketch below.
    Hope it helps.
    Best Regards,
    Kush Kashyap
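    To narrow this down, a rough sketch (it assumes the logical file name from this thread, ZTEST_DATA_FILE, and that you run it on the same application server the DTP used): resolve the logical name and test whether the physical file exists, so you can see exactly which path the DTP wrote to.
    " Sketch: resolve the logical file name and test whether the physical file exists.
    " ZTEST_DATA_FILE is the logical file name described in this thread.
    DATA lv_file TYPE filename-fileextern.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZTEST_DATA_FILE'
      IMPORTING
        file_name        = lv_file
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc <> 0.
      WRITE: / 'Logical file name not found in tcode FILE.'.
    ELSE.
      WRITE: / 'Resolved path:', lv_file.
      " Try to open the file on the application server (read-only).
      OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc = 0.
        WRITE: / 'File exists and is readable on this server.'.
        CLOSE DATASET lv_file.
      ELSE.
        WRITE: / 'File does not exist (or is not readable) on this server.'.
      ENDIF.
    ENDIF.
    If the resolved path is not the /usr/sap/<sysid>... directory you expect, correct the physical path assignment in transaction FILE; also remember that AL11 only shows the directories of the server you are logged on to.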

  • Open hub destination with datasource as the source

    Hi,
    I am trying to use the new BI feature for open hub destinations where you can use a DataSource as the source for extraction.
    I have created an InfoPackage, loaded the data into the PSA, and from there into the open hub destination via the DTP.
    When I try to execute the DTP, I get the following error:
    Datapackage processing terminated.
    Error in substep
    You are not authorized to use this transformation.
    I have been able to use DTPs to load open hub destination targets with DataStore objects and InfoCubes as the source.
    Also when we turn on the security trace, it doesn't fail anywhere.
    Has anybody faced a similar problem?
    Any help appreciated,
    Thanks,
    Payal.

    I have created two OHDs - one for a 3.5 DataSource and one for a 7.0 DataSource (they are different DataSources). While creating the DTP for the 7.0 DataSource, it did create a default mapping for the transformation.
    But I still get the same error in both cases.
    When I create the DTP, it tries to create a transformation, but I get an information message:
    Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
    I can still activate the DTP, and when I run it, it says I do not have authorization for this transformation.
    I think there is an issue while creating the transformation, but I don't know what!
    -Payal.

  • Open Hub Destination to Oracle database

    Hello All,
    I am supposed to send data from my SAP BI (7.0) system to an Oracle database using an open hub destination. I have to use the option "Third Party" as the destination while creating the Open Hub Destination.
    From my reading, and according to Basis, I realised that I need a third-party ETL tool between BI and the Oracle system to write the data into Oracle, since Oracle is not SAP RFC-aware. The third-party ETL tool that I am supposed to use here is BusinessObjects Data Integrator (BO-DI). If anyone has implemented this type of open hub service, please shed some light on how to set up the third-party ETL tool in between and how to proceed. Any how-to document would also be useful.
    Thanks and Regards,
    Riddhi

    Hi,
    Please go through this link:
    http://help.sap.com/saphelp_nw70/helpdata/EN/43/79f902dfb06fc9e10000000a1553f6/frameset.htm
    Hope it helps,
    Thanks & Regards,
    SD

  • Open Hub Destination over a Datasource as a source

    Hi Expert,
    I am trying to create an open hub destination with a DataSource as the source,
    but it shows me the error "Cannot connect DataSource to an open hub destination".
    Can anyone tell me whether it is possible to create an open hub destination on a DataSource?
    If yes, then how?
    Thanks and Regards
    Lalit Kumar

    Hi Lalit,
    Check the forum threads below for reference; they discuss using a DataSource as the source for an OHD (I am not sure):
    Open hub destination with datasource as the source
    OHD ISSUE
    Thanks,
    Deepak

  • Open hub destination of text and attribute for Functional Area

    Hi Expert
    I have a requirement to accommodate both the attribute (0FUNCT_LOC_ATTR) and the text (0FUNCT_LOC_TEXT) for functional area in an open hub destination. For this, a DSO has been created from both the 0FUNCT_LOC_ATTR and 0FUNCT_LOC_TEXT DataSources. The problem is that the DSO has 0FUNCT_AREA (Functional Area) as its only key field, so the texts in both English (EN) and Dutch (NL) for a particular 0FUNCT_AREA are not getting loaded. If we assign both 0LANGU and 0FUNCT_AREA as a composite key of the DSO, the data from the attribute transformation is loaded as blank, because there is no language field in the attribute DataSource (0FUNCT_LOC_ATTR). Please let me know how I can design this to accommodate both the texts in different languages and the functional area for the open hub destination.
    I know that I can create separate open hub destinations for the master data text and attribute.
    Any help is highly appreciated.
    Regards
    Saikat

    Hi Saikat,
    You can create an additional counter characteristic and make it a key field in the DSO. After that, we can populate this characteristic in the end routine. Its value increases by 1 every time a different text comes in for the same functional area.
    Sample code
    DATA: count   TYPE i,
          w_farea TYPE /bi0/oifunc_area.
    FIELD-SYMBOLS: <result_fields> LIKE LINE OF result_package.

    SORT result_package BY func_area.
    count = 0.
    " Check whether we are still on the same functional area or on a new one.
    LOOP AT result_package ASSIGNING <result_fields>.
      IF w_farea IS INITIAL OR w_farea = <result_fields>-func_area.
        " Same functional area (or first record): next text gets the next counter value.
        count = count + 1.
      ELSE.
        " New functional area: restart the counter.
        count = 1.
      ENDIF.
      w_farea = <result_fields>-func_area.
      <result_fields>-zcount = count.
    ENDLOOP.
    CLEAR w_farea.
    This way the value of the counter is increased by one every time a new text comes in for the same functional area. We can have any number of texts for the same functional area with this method.
    The only issue would be when a delta update comes in for an existing text. If you are doing a full load to the DSO from the DataSource, then there is no issue with this method.

  • Open Hub Destination to Informatica

    Hi All,
    I am having the following issue. We have created an Open Hub Destination on an InfoCube for a third-party tool (Informatica), but the target table is not visible in Informatica. They were able to view and extract the data when it was created using InfoSpokes, but when we use an Open Hub Destination they are not able to view it. We verified the RFC connection. We also triggered the DTP for the Open Hub Destination, and it is sending data to the target table structure.
    Please help me in this regard
    Regards
    YR

    Did you use Informatica's SAP connector? It is an add-on, and a license has to be purchased from Informatica. It enables delta loads and continuous daily loads from SAP BI to the target system with Informatica as the ETL tool. An InfoSpoke creates a .csv file in a particular folder, which is why Informatica is able to see it.
    Thanks,
    S

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destination (NOT an InfoSpoke).
    Output is to a flat file. Remember that when output is to a flat file, two files are generated: the output file and a structure file (S_<name of output file>.CSV).
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted alphabetically by field name, instead of listing them in the order in which they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Does anyone know a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links; this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know, and as I have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, the source structure).
    I would suggest you check it again.
    Derya

  • Use of Open Hub Destination to load data into BPC

    Hi Gurus,
    I want to load data from SAP BI to BPC (NW version) using an Open Hub Destination. I want to know a few things about this.
    1) What method should I use? Should I use flat files or database tables as the destination? If database tables, how can I use the data in those tables in BPC?
    2) If I go for flat files, which are saved on the BI application server, how can I import those files into BPC and use them with the Data Manager?
    3) Also, in the case of flat files, two files are created: one is the control file and the other is the data file. How can I use both of them? Or can I just use the data file without the header?
    Your replies will be much appreciated.
    Thanks,
    Abhishek

    Hi Anjali,
    I can use the standard Data Manager package from BI to BPC if the CSV file is available on the BPC server, or if I am extracting directly from a BW object, an InfoObject or an InfoCube.
    But since I will be using Open Hub, the output can be in a database table or a CSV file, preferably on the SAP application server. In that case, how can I use the database table or CSV file in the standard Data Manager package?
    Thanks for your reply.
    Abhishek

  • Open Hub Destination question

    Hi,
    We want to be able to extract data from a cube and would like to use Open Hub Destination for it. We also want to extract deltas.
    The question is:
    should we create a single InfoPackage for this with delta capability,
    or two separate InfoPackages: one for full mode and another for delta mode?
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to extract the data will be the source.
    First create the open hub destination in transaction RSBO,
    then in the open hub destination select the target table (to which you want to export) and create the transformation,
    then create the DTP from the cube to the table.
    You can directly use a delta DTP: the first run brings all the records, and later only the delta records are updated.
    Thanks & Regards,
    Sathish

  • Open Hub Destination error

    Hi,
    I am working on Open Hub (for the first time, of course) and performing the following steps:
    Modeling > Open Hub > InfoArea > Create Open Hub Destination > Output to File (.csv, field names) > Directory > Activate > Create Transformation > Activate Transformation > Create DTP. Here I am getting an error that the open hub destination is not active.
    Can anyone please tell me if I am missing a step here?
    Thanks

    Hi...
    Always bear in mind that the OHD, once created, should be saved and activated before creating any transformations or DTPs for it.
    And each time the OHD is changed, all the objects corresponding to it (the transformations and the DTPs) should be reactivated.
    Hope this helps.
    Thanks
    Sree Ramya Lanka

  • DTP Delta request for open hub destination.

    Hi expert,
    As you know, in BW 7.0 the open hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases the DTP delta is enabled.
    My questions are:
    1) In the context menu of the open hub destination, I did not find something like Manage, which would list all the DTP requests executed for this open hub destination, but for a normal InfoCube or ODS you can find such a list.
    2) If the DTP has delta enabled, where can I delete the latest delta request and run a new load?
    3) I assume the DTP can be executed in batch, integrated into a process chain via the process type 'DTP', right? If the destination type is a third-party tool, can it also be integrated into a process chain?
    4) Can you give some of the main differences between an InfoSpoke and an open hub destination in terms of advantages and disadvantages?
    Thanks a lot for the feedback.
    Best Regards,
    Bin

    Hi,
    Please look at the links below. Hope it helps.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh

  • To view the data in open hub destination

    Hi all,
    Can anyone help me with how to check the data in the open hub destination file?
    These are the details:
    Destination: File
    Type of file : Logical file
    Application Server file name : XXXXXXX
    If it were a physical file name, I know it has to be checked in AL11 with the directory name and file name.
    Here it is a logical file name. Which directory does it correspond to?
    Thanks,
    tinkugeo

    Hi,
    Question: "Are all the logical files are saved in the home directory DIR_HOME?
                      Please correct me if I am wrong"
    Ans: When you create an Open hub destination using Physical file, the file will get created in DIR_HOME directory (by default) in AL11
    And for locating the physical path in AL11. Go through the below link. it gives ans to all your ques
    http://bx.businessweek.com/sap/view?url=http%3A%2F%2Fweblogs.sdn.sap.com%2Fpub%2Fwlg%2F17370
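    If you want to see the exact AL11 directory without guessing, a small sketch (ZMY_OHD_LOGICAL_FILE is only a placeholder for the logical file name used in the open hub destination):
    " Sketch: resolve the logical file name and show the directory part,
    " which is the directory to open in AL11.
    " ZMY_OHD_LOGICAL_FILE is a placeholder for your logical file name.
    DATA: lv_path TYPE filename-fileextern,
          lv_dir  TYPE string,
          lv_off  TYPE i.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZMY_OHD_LOGICAL_FILE'
      IMPORTING
        file_name        = lv_path
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    CHECK sy-subrc = 0.

    lv_dir = lv_path.
    " With ALL OCCURRENCES, MATCH OFFSET is set to the last separator found.
    FIND ALL OCCURRENCES OF '/' IN lv_dir MATCH OFFSET lv_off.
    IF sy-subrc = 0.
      lv_dir = lv_dir(lv_off).   " keep only the directory part
    ENDIF.
    WRITE: / 'Physical file :', lv_path,
           / 'AL11 directory:', lv_dir.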

  • Open Hub Destination transports

    Hi Experts..
    We have created Open Hub Destinations to pull data from an ODS into a DB table in BI 7.0. But during transport, the transformations for the OHD are failing and become inactive with the error message "Target ODSO ZODS*** does not exist", although the ODS is present in the QA system and is in the active version.
    So can anyone please help me with the proper sequence for Open Hub Destination transports and the necessary objects to be captured in the request? Currently I have captured the OHD, transformation and DTP in the same request.
    Regards,
    Umesh.

    Send the OHD in one request, transformations in one request, then DTP in another request.
    When you club the transformations and DTP together, it fails, in my experience.
