Open Hub Destination with Flat File - Structure file is not correct

We are using the new Open Hub Destination concept (NOT an InfoSpoke).
Output is to a flat file. Remember that when the output is a flat file, two files are generated: the output file and a structure file (S_<name of output file>.CSV).
The output file's structure corresponds to the structure defined in the Open Hub Destination.
However, the S_ (structure) file does not reflect the proper structure: it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file.
I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
As a result, the S_ file cannot be used to describe the output file, which is a problem.
Does anyone know a solution to this issue? The goal is to get the S_ file to reflect the actual structure of the output file.
I could not find any SAP Notes on this. Please do not send general help links; this is a specific question.
Thank you all for your support.

Hi,
However, the S_ (structure) file does not reflect the proper structure: it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file.
As far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, the source structure).
I would suggest you check it again.
Derya

Similar Messages

  • Open hub destination to download the output file into R/3

    Hi Guys,
    I have a requirement where I need to download the data from my BW cube to R/3. For this I am using Open Hub Destination.
    While creating the Open Hub Destination there is an option called RFC Destination (in which I have selected my R/3 system as the destination). When I execute my DTP it says the load is successful and it shows the number of records transferred.
    But I am not able to find where this data is being transferred to in the R/3 system when loading from the BW cube via open hub.

    Hi Friends,
    I have one Query on WDJ. Here I have two requirements.
    When I click the "Export Excel" button, the data is downloaded into an Excel file. Now my requirement is that when I click the Save button, the file is saved into the path /exchange/CED on the R/3 system.
    CED is the R/3 system ID; the file should be saved under that system ID directory.
    Here the EP server and the R/3 server are different systems.
    I have written this code:
    InputStream text = null;
    int temp = 0;
    try {
         // File name comes from the resource attribute in the Web Dynpro context
         File file = new File(wdContext.currentContextElement().getResource().getResourceName().toString());
         wdComponentAPI.getMessageManager().reportSuccess("File::" + file);
         // This writes into the local working directory of the EP server node
         FileOutputStream op = new FileOutputStream(file);
         wdComponentAPI.getMessageManager().reportSuccess("op::" + op);
         if (wdContext.currentContextElement().getResource() != null) {
              // Copy the resource content byte by byte into the output file
              text = wdContext.currentContextElement().getResource().read(false);
              while ((temp = text.read()) != -1) {
                   op.write(temp);
              }
         }
         op.flush();
         op.close();
         // path is a context/field attribute holding the intended target location
         path = "/exchange/CED/" + file;
         wdComponentAPI.getMessageManager().reportSuccess("path:" + path);
    } catch (Exception ex) {
         wdComponentAPI.getMessageManager().reportSuccess(ex.getMessage());
    }
    My requirement is that the file is saved in this path (path = "/exchange/CED/" + file;).
    Regards
    Vijay Kalluri
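
    A note on the snippet above: it writes the file into the Web Dynpro server's local working directory and only builds the /exchange/CED path as a string. Below is a minimal sketch of writing directly to the target path, under the assumption that /exchange/CED is mounted (for example via NFS) on the EP server; if the two hosts do not share a file system, the file has to be transferred separately (for example by FTP). The variable names here are illustrative only.

    // Assumption: /exchange/CED is visible from the EP server's file system.
    // If not, write locally first and transfer the file to the R/3 host afterwards.
    String fileName = wdContext.currentContextElement().getResource().getResourceName().toString();
    File target = new File("/exchange/CED", fileName);
    InputStream in = wdContext.currentContextElement().getResource().read(false);
    FileOutputStream out = new FileOutputStream(target);
    int b;
    while ((b = in.read()) != -1) {
         out.write(b);
    }
    out.flush();
    out.close();
    in.close();
    wdComponentAPI.getMessageManager().reportSuccess("Saved to: " + target.getAbsolutePath());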

  • Open Hub Destination with Application Server Issue

    Hi all,
    I have a requirement to save my monthly data on the application server with the help of Open Hub Destination. I have created a DSO,
    and on that DSO I have created an Open Hub with the application server as the target. When I store the monthly file on the application server, the file name should be based on the system date.
    So I have created a logical file: ztest_data
    Assigned a physical path to the logical file: /usr/sap/<sysid>........
    Created a logical file name: Ztest_data_file....
    In the Open Hub I have selected Application Server as my target and checked the box.
    I have selected Logical File Name and given the file name which I created: ztest_data_file...
    I ran the DTP and it shows no errors, but when I check in AL11, in the path which I have given (/usr/sap/<sysid>........), I couldn't find the file.
    Please check this and let me know...
    Thanks In Advance,
    Bobby.

    Hi,
    Please check the link /people/jyothi.velpula/blog/2010/01/20/creating-open-hub-destination-using-a-logical-file-to-extract-the-data .
    Also, is your DTP full or delta? Try running the DTP as full.
    Hope it helps.
    Best Regards,
    Kush Kashyap

  • Open hub destination with datasource as the source

    Hi,
    I am trying to use the new features of BI for open hub destination where you can use datasource as the source for extraction.
    I have created an infopackage, loaded data to PSA and from there to open hub destination via the DTP.
    When I try to execute the DTP, I get the following error:
    Datapackage processing terminated.
    Error in substep
    You are not authorized to use this transformation.
    I have been able to use the DTP to load open hub destination targets with datastore and infocubes as source.
    Also when we turn on the security trace, it doesn't fail anywhere.
    Has anybody faced a similar problem?
    Any help appreciated,
    Thanks,
    Payal.

    I have created two OHDs - one for a 3.5 DataSource and one for a 7.0 DataSource (both are different DataSources). While creating the DTP for the 7.0 DataSource, it did create a default mapping for the transformation.
    But I still get the same error in both cases.
    When I create the DTP, it tries to create a transformation. But I get an information message:
    Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
    I can still activate the DTP, and when I run it, it says I don't have authorization for this transformation.
    I think there are issues while creating the transformation, but I don't know what!
    -Payal.

  • Open Hub Destination - csv files

    Hi all,
    I need to send information through Open Hub Destination from a cube to a file on a server. But the Open Hub Destination is generating 2 CSV files instead of 1: one of the files has just the names of the fields that will be in the columns, and the other file has only the data in the columns.
    Is there a way to generate, through Open Hub Destination, only one file, with the headers in the columns?
    Can you explain to me how to do it?
    Are there other options to generate CSV files automatically from cubes and send them to other applications?
    Thanks a lot.
    Regards.
    Elisabete Duarte

    Hi,
    To send information from a cube to the application server through OHS:
    Two files will be created while using OHS: one will be saved on the application server, and for the other you should give the path. For downloading the file to the application server, in the destination tab, select the application server option and transfer it.
    Hope it helps
    Cheers
    Raj

  • Open Hub Destination-Third Party Tool

    dear all,
    Can anyone guide me as to how data (master data in particular) from BW can be transferred to an Oracle database using open hub destination? Is an RFC connection between BW and the third-party tool enough? Do we have to maintain the same structure in the third-party tool before executing the DTP, and how can this be done?
    I am now able to transfer data from BW into a flat file and also to a database (BW), but I am not aware of this third-party tool functionality. Any light thrown on this area is welcome and points will be awarded.
    thanks ,
    sasidhar gunturu

    Some light at the end of the tunnel for sure...
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool via API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.
    For more details click below -
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7a69d9f3897103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan
    @CP..
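
    To make step 4 above concrete: the process chain that runs the open hub DTP can be started from the third-party side over RFC. The sketch below is only an illustration using SAP JCo (Java Connector), not part of the original reply; the destination name "BW_SYSTEM", the chain name "ZOH_CHAIN", and the export parameter E_LOGID are assumptions that should be verified in your own system (for example in SE37).

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunction;

    public class StartOpenHubChain {
        public static void main(String[] args) throws JCoException {
            // "BW_SYSTEM" is a placeholder JCo destination (configured in BW_SYSTEM.jcoDestination)
            JCoDestination bw = JCoDestinationManager.getDestination("BW_SYSTEM");
            JCoFunction chainStart = bw.getRepository().getFunction("RSPC_API_CHAIN_START");
            if (chainStart == null) {
                throw new IllegalStateException("RSPC_API_CHAIN_START not found in the BW system");
            }
            // I_CHAIN = technical name of the process chain that contains the open hub DTP
            chainStart.getImportParameterList().setValue("I_CHAIN", "ZOH_CHAIN");
            chainStart.execute(bw);
            // E_LOGID (assumed parameter name) identifies the chain run for later status checks
            String logId = chainStart.getExportParameterList().getString("E_LOGID");
            System.out.println("Process chain started, log id: " + logId);
        }
    }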

  • Error in Open Hub Destination Transport

    Hi All,
    I am facing the following problem in transporting the open hub destination.
    We have created a Open Hub Destination with following settings.
    Destination Type :- 'FILE'
    We have ticked the application Server name checkbox.
    In the server name we have selected our BI Development Server.
    In the type of File Name we have selected 'Logical File Name'. We have maintained the details in AL11 and FILE transactions.
    While transporting it to our Quality system, it throws an error saying it does not recognize the server name (which is true, the server name is different in quality system).
    I need your guidance: is there any setting, like a source system conversion, that would change the server name?
    Also, if you could guide me regarding the best practices for transporting the entries we have created in AL11 and FILE, it would be really nice. I did a search on this in SDN and could not find any useful material for my case.
    Thanks in advance.

    Hi,
    Your TR is failing maybe because the application server file system is different in your target system.
    Go to transaction AL11 and check that. If the same folder structure is not available, get it created, or go to RSA1 and try changing your InfoSpoke (open hub) settings to point to the application server file as it exists in your target system.
    You have to point this path to the Q server.
    ~AK

  • Need a code for open hub destination

    Hi Experts,
    I have a requirement on open hub destination, as we are working on FI DataSources.
    While using open hub destination, it creates two files: one is the structure (header) file and the other is the data file.
    Now my client is asking us to combine both of these files into a single file while using open hub destination.
    Could you please explain how to achieve this?
    Please share the code...
    Thanks in advance !!!!
    Regards
    sinu reddy

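    There is no standard switch that makes the file open hub write a single file, so the usual approach is a small post-processing step that prepends the header to the data file. The sketch below shows one illustrative way to do it in Java; the file names S_ZSALES.CSV / ZSALES.CSV and the assumption that the field name is the first column of each row in the structure file are placeholders that must be adapted to your destination. A similar routine could also be written in ABAP and run after the DTP in the process chain.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;
    import java.util.ArrayList;
    import java.util.List;

    public class MergeOpenHubFiles {
        public static void main(String[] args) throws IOException {
            // Placeholder names: S_<destination>.CSV holds the structure, <destination>.CSV the data
            String structureFile = "S_ZSALES.CSV";
            String dataFile = "ZSALES.CSV";
            String mergedFile = "ZSALES_WITH_HEADER.CSV";

            // Assumption: one row per field in the structure file, field name in the first column
            List<String> fieldNames = new ArrayList<String>();
            BufferedReader struct = new BufferedReader(new FileReader(structureFile));
            String line;
            while ((line = struct.readLine()) != null) {
                if (line.trim().length() > 0) {
                    fieldNames.add(line.split(";|,")[0]);
                }
            }
            struct.close();

            PrintWriter out = new PrintWriter(new FileWriter(mergedFile));
            out.println(String.join(",", fieldNames));   // header row built from the structure file
            BufferedReader data = new BufferedReader(new FileReader(dataFile));
            while ((line = data.readLine()) != null) {
                out.println(line);                        // copy the data rows unchanged
            }
            data.close();
            out.close();
        }
    }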

  • Open hub Destinations - Third-Party Tools As Destinations

    Dear all, below are the process flow steps from SAP Help. I am quite clear on all the steps except the APIs. Where do we define these APIs? When it says "You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction", where do I use it? Is it in the 3rd party tool, or is it in SAP BI?
    If it is in the 3rd party tool, where in the 3rd party tool? Do I need to write it as code, or would I have an option in the 3rd party tool?
    If someone can help me with the Microsoft connector, using Microsoft SQL Server 2008 Integration Services packages with SAP BI, that would be helpful.
    Where in the Microsoft SQL Server 2008 Integration Services packages can I use/put these APIs? Any help please.
    If it is in SAP BI, where in SAP BI do I have to use these APIs? Any help is highly appreciated.
    Regards,
    KK
    Process Flow:
    Extraction to the third-party tool can be executed as follows:
           1.      You define an open hub destination with Third-Party Tool as the destination type.
           2.      You create an RFC destination for your third-party tool and enter it in the definition of the open hub destination.
           3.      You use API RSB_API_OHS_DEST_SETPARAMS to define the parameters for the third-party tool that are required for the extraction.
           4.      You either start extraction immediately or include it in a process chain. You can also start this process chain from the third-party tool using process chain API RSPC_API_CHAIN_START. The extraction process then writes the data to a database table in the BI system.
           5.      When the extraction process is finished, the system sends a notification to the third-party tool using API RSB_API_OHS_3RDPARTY_NOTIFY.
           6.      The extracted data is read by API RSB_API_OHS_DEST_READ_DATA.
           7.      The status of the extraction is transferred to the monitor by API RSB_API_OHS_REQUEST_SETSTATUS.

    It's been a long time since I worked with open hub, but here goes:
    The third-party tool needs parameters or switches to work correctly. Think of the old DOS commands with the "/a" or "/b" suffix.
    The APIs are simply a way to pass these parameters from the BW source to the Third Party target.
    If your tool does not need them, don't worry about them.
    John Hawk
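
    To answer the "where": these function modules are RFC-enabled and are called from the third-party side (for example from a custom task in an SSIS package, or any program that can make an RFC call); inside SAP BI you only maintain the RFC destination in the open hub definition. Below is a minimal SAP JCo sketch, added here only as an illustration; it merely looks up the APIs and prints their parameter lists, since the exact parameter names should be taken from SE37 in your system rather than hard-coded here. The destination name "BW_SYSTEM" is a placeholder.

    import com.sap.conn.jco.JCoDestination;
    import com.sap.conn.jco.JCoDestinationManager;
    import com.sap.conn.jco.JCoException;
    import com.sap.conn.jco.JCoFunctionTemplate;

    public class InspectOpenHubApis {
        public static void main(String[] args) throws JCoException {
            // "BW_SYSTEM" is a placeholder JCo destination pointing to the BI system
            JCoDestination bw = JCoDestinationManager.getDestination("BW_SYSTEM");
            String[] apis = {
                "RSB_API_OHS_DEST_SETPARAMS",
                "RSB_API_OHS_DEST_READ_DATA",
                "RSB_API_OHS_REQUEST_SETSTATUS"
            };
            for (String name : apis) {
                JCoFunctionTemplate tmpl = bw.getRepository().getFunctionTemplate(name);
                if (tmpl == null) {
                    System.out.println(name + " not found");
                    continue;
                }
                // Print the importing/exporting parameters so the caller knows what to supply
                System.out.println(name + " imports: " + tmpl.getImportParameterList());
                System.out.println(name + " exports: " + tmpl.getExportParameterList());
            }
        }
    }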

  • Open Hub Destination over a Datasource as a source

    Hi Expert,
    I am trying to create an Open Hub Destination with a DataSource as the source,
    but it is showing me the error "Cannot connect DataSource to an open hub destination".
    Can anyone tell me whether it is possible to create an open hub destination over a DataSource?
    If yes then how?
    Thanks and Regards
    Lalit Kumar

    HI lalit,
    Check the forum threads below for reference; they discuss using a DataSource as the source for an OHD, though I am not sure:
    Open hub destination with datasource as the source
    OHD ISSUE
    Thanks,
    Deepak

  • Open Hub Destination - From BI7 to R/3 error

    Hi All ,
    I am in the process of transferring infocube data from BI 7.0 to R/3.
    I created an open hub destination with a third-party tool as the destination type. Once I execute the DTP, I get an
    error message:
    CREATE DATA: The specified type /BIC/OH**** is no valid data type.
    The RFC connection between the two systems is in place.
    Please guide me on where I am going wrong.
    Any help is highly appreciated.
    Thanks & Regards,
    Anjna Goyal

    Hi,
    Set the correct parameters in the case of another system, i.e. an external system; see the articles below on Open Hub Destination.
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy

  • Open Hub Destination and infospoke

    Hi,
    What is the difference between an Open Hub Destination and an infospoke ?
    Thanks,
    Satya

    Hi,
    InfoSpokes are replaced by Open Hub Destinations as of the NW2004s release.
    It might be worth taking a look:
    <http://help.sap.com/saphelp_nw2004s/helpdata/en/43/58e1cdbed430d9e10000000a11466f/content.htm>
    <http://help.sap.com/saphelp_nw2004s/helpdata/en/43/79f902dfb06fc9e10000000a1553f6/content.htm>

  • Help~open hub destination file is not open error!

    Hi all,
    I have hit a problem; I tried several times and always got error messages.
    It is about open hub destination. I want to use it to send some files to the server, but when I execute the DTP, I get a system dump:
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Runtime Errors         DATASET_NOT_OPEN
    Except.                CX_SY_FILE_OPEN_MODE
    Date and Time          26.10.2011 01:18:09
    Short text
        File "I:\usr\sap\Q26\DVEBMGS11\work\ZSS_O0021.CSV" is not open.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "CL_RSB_FILE_APPLSRV===========CP" had to be terminated
        because it has come across a statement that unfortunately cannot be executed.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    I have two open hubs; one is fine, but the other one always gets this error. Help...

    Thanks! I checked my ID and found that I have the role to access this folder.
    Then I found that our Q system has two servers, A and B, and for open hub I should run my process chain on server B, but I forgot that, so I got the error. After I changed to server B, I re-activated my open hub and DTP, then refreshed the process chain and executed it again, and the error disappeared.

  • Open Hub Destination Table Vs File

    Hello Experts,
    We have a requirement to send data from BW to a 3rd party system every day. In the Open Hub destination we have two options for the target:
    1. Table
    2. File
    An ETL tool like Informatica will extract the data from the table or file into the 3rd party system.
    Which way is best in terms of performance and maintenance?
    Thanks in advance
    Sree

    Hi,
    If you follow the delta mechanism then the table is good. You can also go for the file; that is not a problem. See examples of both in the following articles.
    If you use files then go for the Application Server option, i.e. AL11, because no one can change the files in this location and you can restrict the authorizations.
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy
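
    If the table option is chosen, the open hub writes into a generated database table (named /BIC/OH<destination>, as also mentioned in the CREATE DATA error thread above), and the ETL tool reads from there; with the delta mechanism it can pick up only the new open hub requests. A rough JDBC sketch is below, purely as an illustration; the driver URL, schema, credentials, table name /BIC/OHZSALES and the request-ID column name OHREQUID are all assumptions to be checked against your own system.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ReadOpenHubTable {
        public static void main(String[] args) throws Exception {
            // Placeholder JDBC URL, schema and credentials; use a read-only database user
            Connection con = DriverManager.getConnection(
                    "jdbc:sap://bwhost:30015", "ETL_READER", "secret");
            // Assumption: OHREQUID is the open hub request number used for delta handling
            PreparedStatement stmt = con.prepareStatement(
                    "SELECT * FROM \"SAPSR3\".\"/BIC/OHZSALES\" WHERE \"OHREQUID\" > ?");
            stmt.setLong(1, lastProcessedRequest());
            ResultSet rs = stmt.executeQuery();
            while (rs.next()) {
                // Hand each row over to the downstream system (simplified to console output here)
                System.out.println(rs.getString(1));
            }
            rs.close();
            stmt.close();
            con.close();
        }

        // Hypothetical helper: the last request id already loaded into the 3rd party system
        private static long lastProcessedRequest() {
            return 0L;
        }
    }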
