Open Hub in BW 3.1

Hi,
Can someone please clarify the Open Hub steps in SAP BW version 3.1? I no longer have access to a 3.1 system at the moment, but I have to give my views for a new development strategy. I am not sure whether it had something like the InfoSpoke that we use in version 3.5 (I have worked in 3.5 as well).
Also, if you could shed some light on its licensing, that would be great. I am not sure whether a license is required when a customer uses Open Hub to extract data from a SAP BW 3.1 InfoProvider.
Ritika Sharma

I need to know whether any of you have an idea about SAP BW 3.1: does it have InfoSpokes? And will our customer need a license if data has to be extracted from InfoProviders?
We have to use an InfoSpoke as a one-time activity to extract data from SAP BW InfoProviders.
Any clarification on the same would be appreciated.
Ritika Sharma

Similar Messages

  • Error while assigning constant to infoobject in open hub transformation

    While assigning constants (in the rule details) to an InfoObject in an open hub transformation, I am getting the error "The Object Name is not allowed to be empty".
    Can anyone tell me why this is happening? What should I do now?

    Hi,
    In the transformation, have you connected that rule to any target field of your destination? If not, then do that and then try to create the routine.
    Regards,
    Vaibhav

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
    We are using open hub to download transaction files from InfoCubes to the application server, and would like to have a filename that is dynamic based on period and year, i.e. the period and year of the transaction data being downloaded.
    I understand we could use a logical file for this purpose. However, we are not sure how to have the period and year dynamically derived in the filename.
    I have read a number of messages posted on SDN on a similar topic, and many have suggested a 'How-to' paper titled "How to Extract Data with Open Hub to a Logical Filename". However, I could not get the document from the link given.
    I was just wondering whether anyone has the correct or latest link to the document; I would also appreciate it if you could share the document with everyone on SDN if you have a copy.
    Many thanks and best regards,
    Victoria

    Hi,
    After creating the open hub, press F1 in the Application Server File Name text box. From the help window, click Maintain 'Client independent file names and file paths'; this takes you to the Implementation Guide screen. Then:
    - Click 'Cross-client maintenance of file names and paths' and create a logical file path via 'New Entries'.
    - After creating the logical file path, go to 'Logical file name definition'. Enter your logical file name, the physical file name (your file name followed by month or year, whatever is applicable; press F1 for more information), the data format (ASC), the application area (BW), and the logical path (choose via F4 the path you created in the first step).
    - Go to 'Assignment of physical path to logical path' and enter the syntax group; the physical path is the one you entered in the logical file name definition.
    We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help will give you more details. The steps above will let you create a dynamic logical file.
    Hope this helps to some extent.
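    For illustration, here is a minimal ABAP sketch of how a logical file name maintained in transaction FILE can be resolved at runtime. The logical name ZBW_OPENHUB_FILE, its placeholder parameters, and the period/year values are assumptions for this example, not part of the original solution:
    " Resolve a hypothetical logical file name with period/year placeholders.
    DATA lv_physical(128) TYPE c.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZBW_OPENHUB_FILE'  " assumed logical file name
        parameter_1      = '001'               " e.g. fiscal period
        parameter_2      = '2010'              " e.g. fiscal year
      IMPORTING
        file_name        = lv_physical
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / lv_physical.  " fully substituted physical path and file name
    ENDIF.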
    Regards

  • UNCAUGHT EXCEPTION while executing Open Hub

    Hi All,
    I have created an Open Hub destination in BI 7.0 on a custom cube. I am trying to create a file on the application server using this open hub. I have also created the logical filename, etc.
    When I run the DTP for this open hub, I get the following error:
    Runtime Errors UNCAUGHT_EXCEPTION
    Except. CX_RSB_WRITE_ERROR
    This was running fine in the DEV and QA systems, but it gives this error in PRD.
    Any clue about this error?
    Thanks in advance.

    Hello,
    it seems the logical path for logical filename is not correct. Could you please check it in t-code FILE?
    Kind Regards,
    Paula Csete

  • Extract Data with OPEN HUB to a Logical Filename

    Hi Experts,
    Can anybody send me the link to the how-to guide "Extract Data with Open Hub to a Logical Filename"?
    Thanks in advance.
    BWUser

    Hi,
    check these links...
    http://searchcrm.techtarget.com/generic/0,295582,sid21_gci1224995,00.html
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/e698aa90-0201-0010-7982-b498e02af76b
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1570a990-0201-0010-1280-bcc9c10c99ee
    Hope this helps you.
    Regards,
    shikha

  • Open Hub not extracting all data in CUBE

    We created an FI-SL cube and use the OPEN HUB process to extract the current value of accounts.  The problem is that not all the data is being extracted.
    I have a couple of examples where some of the data in the CUBE is not being extracted by the OPEN HUB process. I have checked the filter and it is set up properly. I have manually validated the data in the CUBE.
    It is very strange that the data is in the CUBE correctly, but it is just not being extracted.
    Any ideas?
    Thanks,
    Bill Lomeli

    Ashok,
    This is the first time I have seen a comment about the field limitation.
    A previous message from Deepak Simon states:
    "I want to extract data from a cube with more than 16 fields."
    We are using BI 7.0...is that restriction still in place for this version?
    Thanks,
    Bill

  • Open Hub (SAP BW) to SAP HANA data loading through DB Connection: 'Delete Data from Table' option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination of type DB Table using this DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Delete Data from Table' option.
    Data was loaded, 16 records from BW to HANA, the same on both sides.
    The second time I executed it from BW to HANA, 32 records arrived (it appends).
    Then I executed the Open Hub service with the 'Delete Data from Table' option checked.
    Now I am getting the short dump DBIF_RSQL_TABLE_KNOWN.
    If I do the same from SAP BW to SAP BW, it works fine.
    Does this option work through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub level (definition level, Destination tab and Field Definition): there is a checkbox there that I have already selected, and that is exactly my issue; even though it is selected, the deletion at the target is not performed.
    SAP BW to SAP HANA via DB Connection:
    1. The first time, say 16 records from BW; the DTP is executed and loads up to HANA, 16 records, the same.
    2. The second time I execute it from BW, the HANA side appends, so 16+16 = 32.
    3. So I selected the checkbox 'Delete Data from Table' at the Open Hub level.
    4. Now when I execute the DTP, it throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Now please tell me how to resolve this. Does the 'Delete Data from Table' option apply to HANA?
    Thanks
    Santhosh Kumar

  • How to delete the Generated files from application server(open hub)?

    hi experts,
    When I try to execute the DTP in a process chain, it gives the dump below; exception CX_RSBK_REQUEST_LOCKED is logged.
    When I execute the DTP manually and try to delete the previous request, it gives the dump ITAB_DUPLICATE_KEY.
    So, to delete the generated files from the application server, how do I delete them for specific dates?
    Information on where terminated
    Termination occurred in the ABAP program "GPD6S3OE0BCVGC6L9DBNVYQARZM" - in
    "START_ROUTINE".
    The main program was "RSBATCH_EXECUTE_PROZESS ".
    In the source code you have the termination point in line 2874
    of the (Include) program "GPD6S3OE0BCVGC6L9DBNVYQARZM".
    The program "GPD6S3OE0BCVGC6L9DBNVYQARZM" was started as a background job.
    When I check the dump, it points to the code below:
    " Populate the lookup table for 0STOR_LOC
    SELECT * from /BI0/TSTOR_LOC
    into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
    FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
    STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    But the program is syntactically correct.
    How do I rectify this issue?
    regards
    venuscm
    Edited by: venugopal vadlamudi on Sep 28, 2010 1:59 PM

    hi experts,
    We have written a start routine to get the storage location text and send it to a file located on the application server through Open Hub.
    Here is the code written in the transformation.
    In the global section
    " Text for 0STOR_LOC
        DATA: l_0stor_loc_text TYPE HASHED TABLE OF /bi0/tstor_loc
              WITH UNIQUE KEY stor_loc.
        DATA: l_0stor_loc_text_wa TYPE /bi0/tstor_loc.
    And in the routine, the code to get the text:
    " Populate the lookup table for 0STOR_LOC
    SELECT * from /BI0/TSTOR_LOC
          into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
          FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
                  STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    I'm sure the problem is with the routine only. I think I need to change the code; if so, please provide the modified version.
    thanks
    venuscm
    Edited by: venugopal vadlamudi on Sep 29, 2010 9:37 AM
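    For reference, here is a minimal sketch of one possible correction (an editor's assumption, not a confirmed fix from this thread). The hashed table L_0STOR_LOC_TEXT has a unique key on STOR_LOC, so any duplicate storage locations in the selection result trigger ITAB_DUPLICATE_KEY; reading into a standard table first, removing duplicates, and guarding against an empty source package avoids the dump:
    " Sketch: build the lookup table with unique STOR_LOC keys.
    " (If the text table is language-dependent, you may additionally
    " want to restrict the selection to SY-LANGU.)
    DATA lt_text TYPE STANDARD TABLE OF /bi0/tstor_loc.

    IF SOURCE_PACKAGE IS NOT INITIAL.  " FOR ALL ENTRIES with an empty table would read everything
      SELECT * FROM /bi0/tstor_loc
        INTO CORRESPONDING FIELDS OF TABLE lt_text
        FOR ALL ENTRIES IN SOURCE_PACKAGE
        WHERE stor_loc = SOURCE_PACKAGE-stor_loc.
      SORT lt_text BY stor_loc.
      DELETE ADJACENT DUPLICATES FROM lt_text COMPARING stor_loc.
      l_0stor_loc_text = lt_text.  " hashed table now receives unique keys only
    ENDIF.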

  • Delete data from Cube which is uploaded to Open hub

    Hello friends, I need to delete the data from the GL cube, but it is connected to 2 open hubs,
    one of which has the table /BIC/OHZZIGL_C10 defined as the open hub destination. I deleted the data from this table in SE14, but when I try to delete the data from the cube it still throws the error below.
    Please help me with how to delete the data from the cube now; should I go into the open hub and delete all the uploaded requests?
    Request 1.882 already updated in target 3.913 by DTP request DTPR_4CU6JHRJ7MY889A7XBUJ0GDVK(ZZIGL_C10)
    Thanks
    soniya

    Just check the data mart status for that request. Go to the Manage screen of your InfoCube and check, via the data mart status, where that request has been updated.

  • Use of Open Hub Destination to load data into BPC

    Hi Gurus,
    I want to load the data from SAP BI to BPC (NW version), using Open Hub Destination. I want to know few things about this.
    1) What method should I use? Should I use Destination as Flat Files or Database tables? If Database tables, how can I use this data in the tables in BPC?
    2) If I go for flat files, which are saved on the BI application server, how can I import those files into BPC and use them with Data Manager?
    3) Also, in the case of flat files, two files are created: one is the control file and the other is the data file. How can I use both of them? Or can I just use the data file without the header?
    Your replies will be much appreciated.
    Thanks,
    Abhishek

    Hi Anjali,
    I can use the standard Data Manager package from BI to BPC if the CSV file is available on the BPC server, or if I am directly extracting from a BW object (InfoObject or InfoCube).
    But since I will be using Open Hub, its output can be a database table or a CSV file, preferably on the SAP application server. In such cases, how can I use the database table or CSV file with the standard Data Manager package?
    Thanks for your reply.
    Abhishek

  • Can we set a key figure in between 2 characteristics in an open hub destination?

    Can anyone tell me how I can insert a key figure in between the characteristics in the open hub destination?
    Example: my key figure description is Quantity,
    and the characteristic descriptions are Material and Plant.
    Now I want the fields to be:
    Material
    Quantity
    Plant
    When I save the OHD, I get the fields like this:
    Material
    Plant
    Quantity
    Can anyone help me with this?

    Hi,
    If I am not wrong, you want to rearrange the OHD fields.
    In that case, select the field you want to move and press the "Cut" button,
    then place the cursor at the desired position and press "Insert in New Row".
    Regards
    Md Zubair sha

  • Open Hub is not working after upgrade

    Hi Guys,
    I did an SAP BI upgrade from 7.0 to 7.3.
    The open hub destination was working fine before the upgrade, I could see the data coming in the file.
    After the upgrade, there is no data coming to the file. System says "no data available".
    Did anyone encounter this issue before?

    Hi,
    Did you follow the same procedure? Try looking at it from this angle.
    Deleting Data from the Table
    With an extraction to a database table, you can either retain the history of the data or just
    store the new data in the table. Choose Delete Data from Table when defining your
    destination if you want to overwrite the fields. In this case, the table is completely deleted and
    regenerated before each extraction takes place. We recommend that you use this mode if you
    do not want to store the history of the data in the table. If you do not select this option, the
    system only generates the table once before the first extraction. We recommend that you use
    this mode if you want to retain the history of the extracted data.
    Note that if changes are made to the properties of the database table (for example, fields are
    added), the table is always deleted and regenerated.
    Regards,
    Suman

  • Open hub error when generating file in application server

    Hi, everyone.
    I'm trying to execute an open hub destination that saves the result as a file on the application server.
    The issue: in the production environment we have two servers, XYZ is the database server and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save to XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are in SAP BW 7.31 support package 6.
    Any idea what could be the issue or where to look better?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start the process chain from, the DTP job always runs on the A01 server, and this causes an error since the directory does not exist on server XYZ.
    This occurred because the DTP setting for the background job had been left blank. I followed these steps to solve the problem:
    - open DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run" I picked the desired server
    - save
    After that, no matter where I start the open hub extraction from, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro
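    For reference, a small diagnostic sketch (hypothetical report name, placeholder path) that could be run on a specific application server to confirm the target directory is reachable and writable from that host; this is an editor's illustration, not part of the original resolution:
    REPORT zcheck_openhub_dir.  " hypothetical report name

    " Placeholder path: replace with the directory used by the open hub destination.
    DATA lv_file TYPE string VALUE '/interface/openhub/write_check.txt'.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc <> 0.
      WRITE: / 'Directory is NOT writable from this application server.'.
    ELSE.
      TRANSFER 'write test' TO lv_file.
      CLOSE DATASET lv_file.
      DELETE DATASET lv_file.  " clean up the test file
      WRITE: / 'Directory is writable from this application server.'.
    ENDIF.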

  • Open Hub Destination question

    Hi,
    We want to be able to extract data from a cube and would like to use Open Hub Destination for it. We also want to extract deltas.
    The question is:
    should we create a single InfoPackage for this with delta capability,
    or two separate InfoPackages, one for full mode and another for delta mode?
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to extract the data will be the source.
    First create the open hub destination in transaction RSBO, and
    in the open hub destination create the transformation to the table (to where you want to export the data).
    Then create the DTP between the cube and the table.
    You can directly use a delta DTP: the first run brings all the records, and afterwards only the delta records are updated.
    Thanks & Regards,
    Sathish

  • Open Hub Destination error

    Hi,
    I am working on Open Hub (for the first time, of course) and performing the following steps:
    Modeling > Open Hub > InfoArea > Create Open Hub Destination > Output to File (.csv), field names > Directory > Activate > Create Transformation > Activate Transformation > Create DTP. Here I am getting an error that the open hub destination is not active.
    Can anyone please tell me am i missing any step here?
    Thanks

    Hi...
    Always bear in mind that the OHD, once created, should be saved and activated before creating any transformations or DTPs for it.
    And each time the OHD is changed, all the objects corresponding to it (the transformations and the DTPs) should be reactivated.
    Hope this helps.
    Thanks
    Sree Ramya Lanka

  • Open Hub in BW 3.5

    Hello Experts,
    I need to dump all the data (with all the characteristics) of an InfoCube into a table.
    For this, I am implementing it through an InfoSpoke.
    I am selecting all 42 InfoObjects that are present in my InfoCube.
    However, I am getting an error which says that only 16 characteristics are allowed.
    Can someone please tell me how I could dump the entire data of the cube (with all the characteristics) into a table through open hub?
    If not through open hub, could someone please tell me whether there is an alternative method to dump the cube data into a table?
    Regards
    Dipali

    Thanks Pramod and Praveen. I have assigned points.
    My problem is solved and I am able to proceed further.
    I am now proceeding further to dump the data from the cube to the DB table.
    I have activated my InfoSpoke. I now have the destination as, say, /BIC/OHZTEST.
    However, under SE11 I am not able to see this table yet.
    Is there something I am missing in order to push the data from my cube to the DB table?
    PS: I have already selected the datasource as my cube
    Regards
    Dipali
