Open hub destination with datasource as the source

Hi,
I am trying to use the new BI feature for open hub destinations where you can use a DataSource as the source for the extraction.
I have created an infopackage, loaded data to PSA and from there to open hub destination via the DTP.
When I try to execute the DTP, I get the following error:
Datapackage processing terminated.
Error in substep
You are not authorized to use this transformation.
I have been able to use DTPs to load open hub destination targets with DataStore objects and InfoCubes as the source.
Also, when we turn on the security trace, it doesn't show a failure anywhere.
Has anybody faced a similar problem?
Any help appreciated,
Thanks,
Payal.

I have created two OHDs - one for a 3.5 DataSource and one for a 7.0 DataSource (they are different DataSources). While creating the DTP for the 7.0 DataSource, it did create a default mapping for the transformation.
But I still get the same error in both cases.
When I create the DTP, it tries to create a transformation, but I get an information message:
Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
I can still activate the DTP, but when I run it, it says I don't have authorization for this transformation.
I think there are issues while creating the transformation, but I don't know what they are.
-Payal.

Similar Messages

  • Open Hub Destination with Application Server Issue

    Hi all,
    I have a requirement to save my monthly data on the application server with the help of an open hub destination. I have created a DSO,
    and on that DSO I have created an open hub with the application server as the target. When I store the monthly file on the application server, the file name should be based on the system date.
    So I have created a logical path ----ztest_data,
    assigned a physical path to the logical path -----/usr/sap/<sysid>........,
    and created a logical file name ------Ztest_data_file....
    In the open hub I have selected Application Server as my target and ticked the checkbox.
    I have selected Logical File Name and given the file name which I created -----ztest_data_file...
    I ran the DTP and it shows no errors, but when I check in AL11, in the path which I gave (/usr/sap/<sysid>........), I couldn't find the file.
    Please check this and let me know...
    Thanks In Advance,
    Bobby.

    Hi,
    Please check the link /people/jyothi.velpula/blog/2010/01/20/creating-open-hub-destination-using-a-logical-file-to-extract-the-data .
    Also, is your DTP full or delta? Try running the DTP as Full.
    Hope it helps.
    Best Regards,
    Kush Kashyap
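    As a quick check (a minimal sketch, not from the original post), you can resolve where the logical file name actually points on the application server with FILE_GET_NAME; the directory the DTP writes to may differ from the one you are browsing in AL11. The logical file name ZTEST_DATA_FILE is taken from the post above; everything else is illustrative:

    " Resolve the logical file name to the physical path used at runtime.
    DATA lv_physical TYPE filename-fileextern.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZTEST_DATA_FILE'
      IMPORTING
        file_name        = lv_physical
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.

    IF sy-subrc = 0.
      WRITE: / 'Resolved physical file:', lv_physical.   " check this path in AL11
    ELSE.
      WRITE: / 'Logical file name not found - check transaction FILE.'.
    ENDIF.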

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destination (NOT an InfoSpoke).
    Output is to a Flat File. Remember that when output is to a flat file 2 files are generated: the output file and a structure file (S_<name of output file>.CSV)
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Anyone knows a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links, this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know, and have checked, this is not the case. The S_ file lists the fields in the order of the InfoSpoke object sequence (i.e. the source structure shown on the transformation tab).
    I would suggest you check it again.
    Derya

  • Open Hub Destination over a Datasource as a source

    Hi Expert,
    I am trying to create an open hub destination with a DataSource as the source,
    but it shows me the error "Cannot connect DataSource to an open hub destination".
    Can anyone tell me whether it is possible to create an open hub destination over a DataSource,
    and if yes, how?
    Thanks and Regards
    Lalit Kumar

    Hi Lalit,
    Check the forums below for reference; they refer to using a DataSource as the source for an OHD, though I'm not sure:
    Open hub destination with datasource as the source
    OHD ISSUE
    Thanks,
    Deepak
    Edited by: Deepak Machal on Dec 15, 2011 11:04 AM

  • To view the data in open hub destination

    Hi all,
    Can anyone help me with how to check the data in an open hub destination file?
    These are the details:
    Destination: File
    Type of file : Logical file
    Application Server file name : XXXXXXX
    If it were a physical file name, I know it has to be checked in AL11 with the directory name and file name.
    Here it is a logical file name. Which directory corresponds to it?
    Thanks,
    tinkugeo

    Hi,
    Question: "Are all the logical files are saved in the home directory DIR_HOME?
                      Please correct me if I am wrong"
    Ans: When you create an Open hub destination using Physical file, the file will get created in DIR_HOME directory (by default) in AL11
    And for locating the physical path in AL11. Go through the below link. it gives ans to all your ques
    http://bx.businessweek.com/sap/view?url=http%3A%2F%2Fweblogs.sdn.sap.com%2Fpub%2Fwlg%2F17370
    Edited by: kumkum basu on May 20, 2010 8:20 AM
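    Once you know the physical path (resolved from the logical file name, or DIR_HOME by default as mentioned above), a small report can read the generated file directly. This is only a sketch; the path below is a placeholder, not taken from the thread:

    " Read the open hub output file from the application server and list its lines.
    " Replace the placeholder path with the physical path your logical file name resolves to.
    DATA: lv_file TYPE string VALUE '/usr/sap/trans/data/ohd_output.csv',
          lv_line TYPE string.

    OPEN DATASET lv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      DO.
        READ DATASET lv_file INTO lv_line.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        WRITE: / lv_line.
      ENDDO.
      CLOSE DATASET lv_file.
    ELSE.
      WRITE: / 'File could not be opened - check the path and authorizations in AL11.'.
    ENDIF.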

  • Populate text description in the open hub destination

    Hi SDNers,
    I have a requirement with an open hub destination: I am taking cube data to a SQL Server database.
    In that scenario I need the text descriptions, but the cube does not contain them (a cube holds dimension data and at most navigational attribute data, but no text descriptions).
    The text data lies outside the cube, but I need a couple of text descriptions from the master data texts.
    Do I need to write code to populate (look up) the text descriptions from the text table?
    Can any one suggest an idea.
    Thanks,
    Satya

    Yes, you may want to write code in the transformation from the cube to the open hub to fetch the text from the text table of that InfoObject.
    Make sure to create a new InfoObject of length equal to the text field, add this new InfoObject to the open hub, and in the transformation write a field routine like:
    SELECT SINGLE txtsh FROM /bic/txxxxx INTO wa_text
      WHERE key_of_p_table = source_fields-/bic/source_field  " key of the text table = source characteristic
        AND langu = 'EN'.
    IF sy-subrc = 0.
      result = wa_text.
    ENDIF.
    CLEAR wa_text.

  • Open hub destination to download the output file into R/3

    Hi Guys,
                 I have a requirement where I need to download the data from my BW cube to R/3. For this I am using an open hub destination.
    While creating the open hub destination there is an option called RFC Destination (there I selected my R/3 system as the destination). When I execute my DTP it says the load is successful and shows the number of records transferred.
    But I am not able to find where this data actually ends up when it is transferred from the BW cube to the R/3 system using the open hub.

    Hi Friends,
    I have a question on Web Dynpro Java (WDJ). Here I have two requirements.
    I have an "Export Excel" button which downloads the data into an Excel file. Now my requirement is that, on clicking a Save button, the file should be saved to the path /exchange/CED on the R/3 system.
    CED is the R/3 system ID; the file should be saved under that system ID's directory.
    The EP server and the R/3 server are different systems.
    Here is the code I wrote:
     // Copy the uploaded resource into a file on the server and report the target path.
     InputStream text = null;
     int temp = 0;
     try {
         File file = new File(wdContext.currentContextElement().getResource().getResourceName().toString());
         wdComponentAPI.getMessageManager().reportSuccess("File::" + file);

         // Earlier attempts used hardcoded targets such as "D://usr//sap//BPE//" + file.
         FileOutputStream op = new FileOutputStream(file);
         wdComponentAPI.getMessageManager().reportSuccess("op::" + op);

         if (wdContext.currentContextElement().getResource() != null) {
             text = wdContext.currentContextElement().getResource().read(false);
             // Copy the resource byte by byte into the output file.
             while ((temp = text.read()) != -1) {
                 op.write(temp);
             }
         }
         op.flush();
         op.close();

         // "path" is a member/context variable in the original post.
         path = "/exchange/CED/" + file;
         wdComponentAPI.getMessageManager().reportSuccess("path:" + path);
     } catch (Exception ex) {
         wdComponentAPI.getMessageManager().reportSuccess("Exception: " + ex.getMessage());
     }
    My requirement is that the file be saved under this path (path = "/exchange/CED/" + file;).
    Regards
    Vijay Kalluri

  • Error in Open Hub Destination Transport

    Hi All,
    I am facing the following problem in transporting the openhub destination.
    We have created a Open Hub Destination with following settings.
    Destination Type :- 'FILE'
    We have ticked the application Server name checkbox.
    In the server name we have selected our BI Development Server.
    In the type of File Name we have selected 'Logical File Name'. We have maintained the details in AL11 and FILE transactions.
    While transporting it to our quality system, it throws an error saying it does not recognize the server name (which is true - the server name is different in the quality system).
    I need your guidance on whether there is any setting, like a source system conversion, that would change the name.
    Also, if you could guide me on the best practice for transporting the entries we created in AL11 and FILE, that would be really nice. I searched SDN for this and could not find anything useful for my case.
    Thanks in advance.

    Hi,
    Your transport request is probably failing because the application server file system is different in your target system.
    Go to transaction AL11 and check that. If the same folder structure is not available, get it created, or go to RSA1 and try changing your open hub settings to point to the application server file as it exists in your target system.
    You have to point this path to the Q server.
    ~AK
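    One common approach (a sketch with made-up names, not taken from the post) is to keep the logical path definition in transaction FILE system-independent, so the same transported logical file name resolves correctly in each system:

    Logical path:       Z_OHD_PATH
    Physical path:      /usr/sap/<SYSID>/ohd/<FILENAME>
    Logical file name:  Z_OHD_FILE   (assigned to logical path Z_OHD_PATH)

    Because <SYSID> and <FILENAME> are substituted at runtime, the definition transported to the quality system resolves to that system's own directory, provided the directory actually exists there (check in AL11).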

  • Open hub Destinations - Third-Party Tools As Destinations

    Dear all, below are the process flow steps from SAP Help. I am quite clear on all the steps except the APIs. Where do we define these APIs? When it says "You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction", where do I use it? Is it in the third-party tool, or in SAP BI?
    If it is in the third-party tool, where in the third-party tool? Do I need to write it as code, or would I have an option in the third-party tool?
    If someone can help me with the Microsoft connector, using Microsoft SQL Server 2008 Integration Services packages with SAP BI, that would be helpful.
    Where in Microsoft SQL Server 2008 Integration Services packages can I use/put these APIs? Any help please.
    If it is in SAP BI, where in SAP BI do I have to use these APIs? Any help is highly appreciated.
    Regards,
    KK
    Process Flow:
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool using API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.

    It's been a long time since I worked with open hub, but here goes:
    the third-party tool needs parameters or switches to work correctly - think of the "/a" or "/b" suffixes of the old DOS commands.
    The APIs are simply a way to pass these parameters from the BW source to the third-party target.
    If your tool does not need them, don't worry about them. A minimal sketch of step 4 (starting the extraction process chain) follows below.
    John Hawk
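    To make step 4 of the process flow concrete, here is a minimal ABAP sketch of starting the extraction process chain via RSPC_API_CHAIN_START. The chain name Z_OHD_CHAIN is a made-up example; a third-party tool such as an SSIS package would invoke the same RFC-enabled function module remotely over its RFC connection rather than from ABAP:

    " Start the process chain that runs the open hub DTP (step 4 above).
    DATA lv_logid TYPE rspc_logid.

    CALL FUNCTION 'RSPC_API_CHAIN_START'
      EXPORTING
        i_chain = 'Z_OHD_CHAIN'   " hypothetical chain name
      IMPORTING
        e_logid = lv_logid.

    WRITE: / 'Process chain started, log id:', lv_logid.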

  • Open Hub Destination-Third Party Tool

    dear all,
    Can anyone guide me on how data (master data in particular) from BW can be transferred to an Oracle database using an open hub destination? Is an RFC connection between BW and the third-party tool enough? Do we have to maintain the same structure in the third-party tool before executing the DTP, and how can this be done?
    I am now able to transfer data from BW into a flat file and also to a database table (in BW), but I am not aware of this third-party tool functionality. Any light thrown on this area is welcome, and points will be awarded.
    thanks ,
    sasidhar gunturu

    Some light at the end of the tunnel for sure...
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool via API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.
    For more details click below -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7a69d9f3897103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub Destination - From BI7 to R/3 error

    Hi All ,
    I am in the process of transferring InfoCube data from BI 7.0 to R/3.
    I created an open hub destination with a third-party tool as the destination. When I execute the DTP, I get the
    error message:
    CREATE DATA: The specified type /BIC/OH**** is no valid data type.
    The RFC connection between both systems is in place.
    Please guide me on where I am going wrong.
    Any help is highly appreciated.
    Thanks & Regards,
    Anjna Goyal

    Hi,
    Set the correct parameters in the case of another system, i.e. an external system; see the articles below on Open Hub Destination:
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy

  • Open Hub destination in BI 7.0

    I am trying to create an open hub destination to extract data from a MultiProvider to BPC. When I get to the step where I need to assign the InfoProvider to the open hub destination, I cannot find the MultiProvider in the drop-down menu. Does this mean that an open hub cannot use a MultiProvider as the data source?

    Hi,
    InfoCubes, DataStore objects, and InfoObjects (attributes or texts) can be used as open hub data sources; a MultiProvider cannot be selected directly.
    Please refer below link:
    http://help.sap.com/saphelp_nw70ehp1/helpdata/en/ce/c2463c6796e61ce10000000a114084/frameset.htm
    Thanks,
    Rema

  • Open Hub Destination and infospoke

    Hi,
    What is the difference between an Open Hub Destination and an infospoke ?
    Thanks,
    Satya

    Hi,
    InfoSpokes were replaced by open hub destinations with the NetWeaver 2004s release.
    These might be worth taking a look at:
    <http://help.sap.com/saphelp_nw2004s/helpdata/en/43/58e1cdbed430d9e10000000a11466f/content.htm>
    <http://help.sap.com/saphelp_nw2004s/helpdata/en/43/79f902dfb06fc9e10000000a1553f6/content.htm>

  • List of Open Hub Destination

    Hi All,
    Is there a table which lists all the open hub destinations?
    Saurabh.

    Hi Saurabh,
    The tables for Open Hub Services are:
    RSBOHDEST : Directory of Open Hub Destinations
    RSBOHFIELDS : Fields of the Open Hub Destination
    Some function modules are also available; they might be of help to you:
    RSB_API_OHS_DEST_GETLIST
    RSB_API_OHS_DEST_GETDETAIL
    Roy
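    As a small illustration (a sketch, not from the original reply), the active destinations can be read straight from the directory table named above. The OBJVERS = 'A' filter for the active version is an assumption based on the usual BW metadata convention:

    " List all active open hub destinations from the directory table RSBOHDEST.
    DATA: lt_ohdest TYPE TABLE OF rsbohdest,
          ls_ohdest TYPE rsbohdest.

    SELECT * FROM rsbohdest INTO TABLE lt_ohdest
      WHERE objvers = 'A'.

    LOOP AT lt_ohdest INTO ls_ohdest.
      WRITE: / ls_ohdest-ohdest.   " technical name of the destination
    ENDLOOP.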
