Open Hub Destination problem

Hi
I have created an open hub destination (OHD) on a cube and am getting a file in CSV format on the application server.
I have to move this file to BPC, but the problem is that when I run the DTP in the OHD it generates two files: one with the header and the other with the data.
How can I make it generate one file with the header followed by the data?
And how can I send this file automatically to the BPC server (BPC is the Microsoft version, not the NetWeaver one)?

Hi Sadik
BPC is very new in the market and there is not much help available. Yes, it is based on MS SQL, but all data loads are via CSV files only, not through direct table access.
I have tried setting up DB Connect, but I am not able to make the RFC connection and I have no Basis support on my project.
I have solved my problem, but there is one small snag. I am downloading data from an internal table into a file, but this code dumps the file onto my local machine. I want the file to be written directly to the application server.
Following is the code:
CALL FUNCTION 'MS_EXCEL_OLE_STANDARD_DAT'
  EXPORTING
    file_name                 = 'C:\Test'  " <-- this is the main problem
*   file_name                 = 'E:\usr\sap\BDV\DVEBMGS30\work\ZBPC_1000'  " DIR_HOME: application server location
    create_pivot              = 0
    data_sheet_name           = ' '
    pivot_sheet_name          = ' '
    password                  = ' '
    password_option           = 0
  TABLES
    data_tab                  = lt_data_f  " internal table with data
    fieldnames                = lt_header  " internal table with header
  EXCEPTIONS
    file_not_exist            = 1
    filename_expected         = 2
    communication_error       = 3
    ole_object_method_error   = 4
    ole_object_property_error = 5
    invalid_filename          = 6
    invalid_pivot_fields      = 7
    download_problem          = 8
    OTHERS                    = 9.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
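
For reference: MS_EXCEL_OLE_STANDARD_DAT drives Excel on the frontend PC via OLE, so it can only ever write to the local machine. Below is a minimal sketch of writing the same header and data into a single CSV file on the application server with OPEN DATASET instead. The path is the one from the commented line above, and it is my assumption that lt_header holds the column names and that the rows of lt_data_f are flat, character-like structures; adapt both to your actual types.

DATA: lv_file   TYPE string VALUE 'E:\usr\sap\BDV\DVEBMGS30\work\ZBPC_1000',
      lv_header TYPE string,
      lv_line   TYPE string.
FIELD-SYMBOLS: <lv_col> TYPE any,
               <ls_row> TYPE any.

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  MESSAGE 'Cannot open file on application server' TYPE 'E'.
ENDIF.

" Write the header line first, so one file carries both header and data.
LOOP AT lt_header ASSIGNING <lv_col>.
  IF lv_header IS INITIAL.
    lv_header = <lv_col>.
  ELSE.
    CONCATENATE lv_header <lv_col> INTO lv_header SEPARATED BY ','.
  ENDIF.
ENDLOOP.
TRANSFER lv_header TO lv_file.

" Then one CSV line per data record.
LOOP AT lt_data_f ASSIGNING <ls_row>.
  lv_line = <ls_row>.  " valid only for flat, character-like rows
  TRANSFER lv_line TO lv_file.
ENDLOOP.

CLOSE DATASET lv_file.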
Edited by: Navneet Thakur on Jul 2, 2009 2:53 PM

Similar Messages

  • Open hub destination problem - .csv file is empty

    Hi everyone,
    I am trying to extract data from our BW system to a .csv file using an Open Hub Destination.
    The DTP runs successfully, i.e. it shows the total number of records, but when I look in AL11 the records are not there: the .csv file is empty.
    How can I figure out the problem?
    Best regards
    Ahmad

    @Jagadeesh:
    if I understood you correctly, you mean that I should first run the DTP and save to an Excel file to see whether it shows data or not. I think this is not possible: when we create an OHD, for 'Destination Type' we can choose among 'DB table', 'File' and 'Third-party tool',
    so if I choose 'File' then the 'File Name' is a .csv file (same name as the OHD), and I can change it to an .xls file.
    @Arun:
    I have permission to write to the folder.
    Do you mean creating the OH to a database table?
    In the DTP I have a filter on a characteristic; with 1500 records it works (the data is in the .csv file), but with 3500 records the .csv file is empty,
    no error message, no log.
    Regards
    Ahmad

  • Problem in transporting Open Hub destination

    Hi,
    I have an open hub destination whose destination is a database table. When I try to transport the open hub destination to staging, the transport fails with "Unable to activate table /BIC/OHXXX". What could be the reason?
    Thanks,
    Satya

    Hi Satya,
    I have exactly the same problem. The reason the table cannot be activated is that the referenced fields of the open hub table are wrong. Unfortunately, I do not yet know why this happens. It would be great if you could post the solution once you have solved the problem. I will keep you informed...
    Thanks,
    Thomas

  • Open hub destination with datasource as the source

    Hi,
    I am trying to use the new BI feature for open hub destinations where you can use a datasource as the source for extraction.
    I have created an InfoPackage, loaded data to the PSA, and from there to the open hub destination via a DTP.
    When I try to execute the DTP, I get the following error:
    Datapackage processing terminated.
    Error in substep
    You are not authorized to use this transformation.
    I have been able to use DTPs to load open hub destination targets with DataStore objects and InfoCubes as the source.
    Also, when we turn on the security trace, it doesn't fail anywhere.
    Has anybody faced a similar problem?
    Any help appreciated,
    Thanks,
    Payal.

    I have created two OHDs, one for a 3.5 data source and one for a 7.0 data source (two different data sources). While creating the DTP for the 7.0 data source, it did create a default mapping for the transformation.
    But I still get the same error in both cases.
    When I create the DTP, it tries to create a transformation, but I get an information message:
    Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
    I can still activate the DTP, but when I run it, it says I don't have authorization for this transformation.
    I think there is an issue while creating the transformation, but I don't know what!
    -Payal.

  • Error in Open Hub Destination Transport

    Hi All,
    I am facing the following problem transporting an open hub destination.
    We have created an Open Hub Destination with the following settings:
    Destination Type: 'File'
    We have ticked the 'Application Server Name' checkbox.
    As the server name we have selected our BI development server.
    As the type of file name we have selected 'Logical File Name'. We have maintained the details in the AL11 and FILE transactions.
    While transporting it to our quality system, it throws an error saying it does not recognize the server name (which is true, the server name is different in the quality system).
    I need your guidance: is there a setting, like a source system conversion, that would change the name?
    It would also be really nice if you could point me to best practices for transporting the entries we created in AL11 and FILE. I searched SDN and could not find anything useful for my case.
    Thanks in advance.

    Hi,
    Your TR is probably failing because the application server file system is different in your target system.
    Go to transaction AL11 and check. If the same folder structure is not available, get it created, or go to RSA1 and change your InfoSpoke settings to point to the application server file as it exists in your target system.
    You have to point this path to the Q server.
    ~AK
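
    For reference, a logical file name is also what makes the path system-independent. An illustrative definition in transaction FILE could look like the sketch below (all names here are made up); the <SYSID> and <FILENAME> placeholders are resolved at runtime, so the same definition transports to Q unchanged:

    Logical file path:  Z_OHD_PATH
      Physical path:    \usr\sap\<SYSID>\DVEBMGS00\work\<FILENAME>
    Logical file name:  Z_OHD_OUT_FILE
      Physical file:    ZOHD_OUT.CSV
      Logical path:     Z_OHD_PATH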

  • Erase data from open hub destination data service

    Hi,
    I'm getting an error when trying to delete data from a cube whose data is extracted using the open hub service and loaded into a table.
    The error given is the following:
    Request updated in 1794 already by 1795 target DTPR_4EEX9CGGXLTU2QIPWUAIEUDZK DTP request (ZRAPPEL)
    Message no. RSM077
    Diagnosis
    One or more DTP requests have already updated the request.
    Procedure
    Firstly, delete all update requests from your targets. Then you can delete the source request.
    ZRAPPEL is the open hub destination, but it is not clear to me where to delete the requests for this destination so that it will let me delete the cube data.
    Thanks.

    Hi,
    Procedure
    First, delete all update requests from your targets. Then you can delete the source request.
    Follow the same steps as above, i.e. delete the requests from the cube ZRAPPEL.
    Right-click on the cube -> Delete Data.
    The system will then prompt you:
    "Do you want to delete the content of data target ZRAPPEL?"
    Say yes.
    Hope this solves your problem
    santosh

  • Help~open hub destination file is not open error!

    Hi all,
    I ran into a problem; I tried several times and always got error messages.
    It is about an open hub destination: I want to use it to send some files to the server, but when I execute the DTP I get a system dump:
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Runtime Errors         DATASET_NOT_OPEN
    Except.                CX_SY_FILE_OPEN_MODE
    Date and Time          26.10.2011 01:18:09
    Short text
        File "I:\usr\sap\Q26\DVEBMGS11\work\ZSS_O0021.CSV" is not open.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "CL_RSB_FILE_APPLSRV===========CP" had to be terminated because it has come across a statement that unfortunately cannot be executed.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    I have two open hubs; one is fine, but the other always gets this error. Help..........

    Thanks! I checked my ID and found that I have the role to access this folder.
    Then I found that our Q system has two servers, A and B, and for open hub I should run my process chain on server B, but I had forgotten that, so I got the error. After I changed to server B, I re-activated my open hub and DTP, then refreshed the process chain and re-executed it, and the error disappeared.

  • Open hub destination of text and attribute for Functional Area

    Hi Expert
    I have a requirement to accommodate both the attribute (0FUNCT_LOC_ATTR) and the text (0FUNCT_LOC_TEXT) for functional area in an open hub destination. For this, a DSO has been created that is fed from both the 0FUNCT_LOC_ATTR and 0FUNCT_LOC_TEXT data sources. The problem is that the DSO has only 0FUNCT_AREA (functional area) as its key field, so the texts for both English (EN) and Dutch (NL) for a given 0FUNCT_AREA do not get loaded. If we make 0LANGU and 0FUNCT_AREA a composite key of the DSO, the data from the attribute transformation is loaded as blank, because there is no language field in the attribute data source (0FUNCT_LOC_ATTR). Please let me know how I can design this to accommodate both the texts in different languages and the functional area in one open hub destination.
    I know that I can create separate open hub destination for master data TEXT and Attribute.
    Any help is highly appreciated.
    Regards
    Saikat

    HI Saikat,
    You can create an additional counter characteristic and make it a key field in the DSO. We can then populate this characteristic in the end routine; its value increases by 1 every time a different text comes in for the same functional area. See the sample code below.
    Sample code:
    DATA: count   TYPE n,
          w_farea TYPE /bi0/oifunc_area.

    SORT result_package BY func_area.
    count = 0.

    " Check whether we are still on the same functional area or on a new one.
    LOOP AT result_package ASSIGNING <result_fields>.
      IF w_farea IS INITIAL OR w_farea = <result_fields>-func_area.
        count = count + 1.
      ELSE.
        count = 1.
      ENDIF.
      w_farea = <result_fields>-func_area.
      <result_fields>-zcount = count.
    ENDLOOP.
    CLEAR w_farea.
    This way the counter value increases by one every time a new text arrives for the same functional area, so we can have any number of texts per functional area.
    The only issue would be if a delta update comes in for an existing text. If you do a full load from the datasource to the DSO, this method causes no problems.
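
    For illustration, with the counter as an additional key field, rows for the same functional area get consecutive counter values (the values below are made up):
    FUNC_AREA   LANGU   TEXT        ZCOUNT
    1000        EN      Sales       1
    1000        NL      Verkoop     2
    2000        EN      Marketing   1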

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destinations (NOT an InfoSpoke).
    Output is to a flat file. Remember that when output is to a flat file, 2 files are generated: the output file and a structure file (S_<name of output file>.CSV).
    The output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure: it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Does anyone know a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links; this is a specific question.
    Thank you all for your support.

    Hi,
    "However, the S_ (structure) file does not reflect the proper structure: it lists all fields sorted alphabetically by field name instead of in the order in which they appear in the output file."
    As far as I know, and as I have checked, this is not the case: the S_ file has the fields in the order of the InfoSpoke object sequence (and, in the transformation tab, of the source structure).
    I would suggest you check it again.
    Derya

  • Open Hub Destination Table Vs File

    Hello Experts,
    We have a requirement to send data from BW to a 3rd-party system every day. In the Open Hub destination we have two options for the target:
    1. Table
    2. File
    An ETL tool like Informatica will extract the data from the table or file into the 3rd-party system.
    Which option is best in terms of performance and maintenance?
    Thanks in advance
    Sree

    Hi,
    If you follow the delta mechanism, then a table is good. You can also go for a file, that is not a problem; see examples of both in the following articles.
    If you use files, go for the application server option, i.e. AL11, because no one can change the files in that location and you can restrict the authorizations.
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Thanks
    Reddy

  • Delete request in Open Hub Destination

    Hi all,
    I am trying to delete a request in an Open Hub Destination. I have tried report RSBK_DEL_DTP_REQ_FROM_OHD,
    but it does not work. I have also tried to set the request status to 'red' in the table RSBREQUID3RD, but this does not work either.
    I have no clue how to fix the problem.
    I have to delete the request because I have to change the underlying DSO, and that is not possible while the request is active in any data target.
    Please can someone give me any hint
    Regards
    Peter

    Hi,
    In SE37, execute the function module RSBM_GUI_CHANGE_USTATE. On the next screen, enter that request ID for I_REQUID and execute. On the screen that follows, select the 'Status Erroneous' radio button and continue. This function module changes the status of the request from green/yellow to red.
    You should now be able to delete the request.
    Regards.

  • Open Hub destinations

    I am using open hub destinations to transfer cube data into database tables in BI. During activation of the open hub, database tables with names /BIC/OH* are created. I am facing a problem with the reference fields of these tables: the field name itself is taken as the reference field for quantity and currency fields. For example, if I have a field CRM_NETVAM in the table, the reference field shown for it is CRM_NETVAM itself. Because of this, the database table cannot be activated, which ultimately makes the open hub functionality fail.
    I need quick help with this and will give full points to anything that helps me solve it.

    Hello,
    The open hub service is used to distribute data from a BI system to non-SAP systems. For data export from the BI system to a non-SAP system, the central object is the InfoSpoke. Within the InfoSpoke, BI objects act as the open hub data source from which data is extracted in full or delta mode, while flat files and database tables act as the open hub destinations, i.e. the target systems into which the data is transferred. Data can also be transferred using a Business Add-In (BADI).
    Please check this link:
    http://help.sap.com/saphelp_nw04s/helpdata/en/c5/03853c01c89d7ce10000000a11405a/frameset.htm
    Hope it helps.
    Regards,
    Mona

  • Open Hub transporting problems

    Hi everybody, we have problems transporting Open Hub objects: the transport just gives us return code 12 or 8. It refuses to transport the Open Hub Destination, the transformations, and the DTPs, and bear in mind that we are on patch level 15. We have looked at all the SAP Notes, but they refer to earlier patch levels...
    The other problem is that everything is OK in the DEV system, but in the QUALITY system, launching a DTP gives us a dump with ASSIGN_TYPE_ERROR.
    Any ideas??

    My destination path on the DEV box is simply the following (identified by BID in the path):
          D:\usr\sap\BID\DVEBMGS00\work
    When I look at the un-activated open hub on the QA box, BID has already been changed to BIQ, but the server name field is empty...
    I am going to try the object changeability option. By the way, is there any documentation on how to change objects that are marked with object changeability on the QA box (given that we cannot manually change anything on the QA box)?
    Anyway, I think my problem is the server name conversion from "abc-bi-dev" to "abc-bi-qa"!
    Also, please note that the RSBFILE entry for this object in the QA box has a blank "btcsrvname"...
    Do you think any additional setting needs to be made on the QA box for this translation (from "abc-bi-dev" to "abc-bi-qa")?
    The only setup we have done on the BI side is at "Tools -> Conversion of Logical System Names", where you only map the client name from dev to qa and the local file system from dev to qa.
    So, is there anything else to map the host name (from "abc-bi-dev" to "abc-bi-qa"), either from the BI perspective or from the Basis/NetWeaver perspective?
    Thanks a lot...
    ~Sabuj.
    Edited by: Sabuj Haque on Feb 1, 2009 1:11 PM
    Edited by: Sabuj Haque on Feb 1, 2009 5:34 PM

  • Use of Open Hub Destination to load data into BPC

    Hi Gurus,
    I want to load data from SAP BI into BPC (NW version) using an Open Hub Destination. I would like to know a few things about this.
    1) What method should I use: destination as flat files or as database tables? If database tables, how can I use the data in those tables in BPC?
    2) If I go for flat files, saved on the BI application server, how can I import those files into BPC and use them with the Data Manager?
    3) Also, in the case of flat files, two files are created: one is the control file and the other is the data file. How can I use both of them? Or can I just use the data file without the header?
    Your replies will be much appreciated.
    Thanks,
    Abhishek

    Hi Anjali,
    I can use the standard Data Manager package from BI to BPC if the CSV file is available on the BPC server, or if I am extracting directly from a BW object (an InfoObject or InfoCube).
    But since I will be using Open Hub, its output will be a database table or a CSV file, preferably on the SAP application server. In that case, how can I use the database table or CSV file in the standard Data Manager package?
    Thanks for your reply.
    Abhishek

  • Open Hub Destination question

    Hi,
    We want to extract data from a cube and would like to use an Open Hub Destination for it. We also want to extract deltas.
    The question is:
    should we create a single info package for this with delta capability,
    or 2 separate info packages, 1 for full mode and another for delta mode?
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to extract the data will be the source.
    First create the open hub destination in transaction RSBO,
    select it there and create the transformation to the table you want to export to,
    then create the DTP between the cube and the table.
    You can directly use a delta DTP: on the first run all records come across, and afterwards only the delta records are updated.
    Thanks & Regards,
    sathish
