Open Hub Selections

Hello,
Is it possible to change the selection parameters of an InfoSpoke in a closed system?  We would like flexibility in the selection parameters for the data we extract.  Variables would be ideal, but I'm fairly sure they are not supported.  Is there a way to run a single InfoSpoke multiple times, changing the selection parameters each time?
Thanks,
TMS

You can't do this out of the box.
But there is a workaround:
Write an ABAP program in BW in SE38 with selection parameters. The program can then read the InfoSpoke output (be it a flat file or a table) and update it accordingly.
Ravi Thothadri
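Ravi's workaround can be sketched as a small SE38 report. This is an illustrative sketch only: the output table name /BIC/OHZTEST and the field COMP_CODE are assumptions, not from the thread. The report exposes selection parameters and trims the InfoSpoke's output table to the requested selection after each run.

```abap
* Illustrative sketch: post-process an InfoSpoke's output table with
* flexible selections. /BIC/OHZTEST and COMP_CODE are placeholder
* names; substitute your generated output table and filter field.
REPORT z_infospoke_postfilter.

TABLES /bic/ohztest.

SELECT-OPTIONS s_ccode FOR /bic/ohztest-comp_code.

START-OF-SELECTION.
* Keep only the rows matching the selection entered on the screen.
  DELETE FROM /bic/ohztest WHERE NOT comp_code IN s_ccode.
  WRITE: / sy-dbcnt, 'rows outside the selection removed.'.
```

Run the InfoSpoke unrestricted once, then run this report with a different selection before each downstream consumption.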

Similar Messages

  • Data selection in new Open Hub destination Vs Infospoke

    In BW 3.5 we can use transaction RSBO to create an InfoSpoke. In the Selection tab, any InfoObject of the source InfoCube can be used as a data filter for the InfoSpoke, e.g. I can set company code to 1002. In the new BI 7.0 Open Hub Destination, where can I achieve the same by setting a filter on a specific InfoObject of the source cube?

    For Open Hub in BI 7.0 we create a transformation and a DTP. You can apply the same restriction in the DTP's filter (activate the DTP after changing it).

  • Open Hub - variables in selection criteria

    Hi there Experts
    I would like to use a variable in the selection criteria of the DTP for an Open Hub destination, so that only the previous month is extracted.  Is this possible in the Open Hub GUI?
    If not, are there any smart workarounds?
    best regards
    Ingrid

    Good morning Gueorgui,
    I am not using 0FISCPER but posting date, so the fiscal year variant should not be necessary.
    But is it possible to use a variable as in BEx queries, such as 0DAT or some Z variable?
    In general: can an OLAP variable be used in a DTP?
    Thank you.
    Ingrid
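Since BEx/OLAP variables are not available in DTP filters, the usual workaround is an ABAP filter routine on the DTP that derives the previous month at runtime. Below is a minimal sketch of the date arithmetic only; the routine frame generated by the DTP filter dialog is omitted, and 0CALMONTH as the filter field is an assumption.

```abap
* Sketch: derive the previous calendar month (YYYYMM) at runtime,
* e.g. to fill a DTP filter range on 0CALMONTH (assumed field).
DATA: lv_date  TYPE sy-datum,
      lv_month TYPE n LENGTH 6.

lv_date      = sy-datum.
lv_date+6(2) = '01'.        " first day of the current month
lv_date      = lv_date - 1. " date arithmetic: last day of previous month
lv_month     = lv_date(6).  " YYYYMM of the previous month
```

In the generated filter routine, lv_month would then be placed into the low field of the range table with SIGN 'I' and OPTION 'EQ'.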

  • Open Hub: How-to doc "How to Extract data with Open Hub to a Logical File"

    Hi all,
    We are using Open Hub to download transaction files from InfoCubes to the application server, and would like the filename to be dynamic based on period and year, i.e. the period and year of the transaction data being downloaded.
    I understand we could use a logical file name for this purpose.  However, we are not sure how to derive the period and year dynamically in the filename.
    I have read a number of SDN posts on this topic, and many suggest a how-to paper titled "How to Extract Data with Open Hub to a Logical Filename".  However, I could not get the document from the link given.
    Does anyone have the correct or latest link to the document? If you have a copy, it would be appreciated if you could share it with everyone on SDN.
    Many thanks and best regards,
    Victoria

    Hi,
    After creating the Open Hub destination, press F1 in the 'Application server file name' text box. In the help window, click 'Maintain client-independent file names and file paths'; this takes you to the Implementation Guide screen. Then:
    1. Click 'Cross-client maintenance of file names' and create a logical file path via 'New Entries'.
    2. Go to 'Logical file name definition' and enter your logical file name, the physical file (your file name followed by month or year, whichever applies; press F1 for details), data format (ASC), application area (BW), and the logical path (choose the one you created in step 1 from the F4 selection).
    3. Go to 'Assignment of physical path to logical path' and enter the syntax group; the physical path is the one you gave in the logical file name definition.
    We created a logical file name that identifies the file by system date, but your requirement seems to be the dynamic date of the transaction data; you may be able to achieve this by creating a variable. The F1 help is very useful here. The steps above will let you create a dynamic logical file.
    Hope this helps to some extent.
    Regards
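At runtime, a logical file name maintained in transaction FILE is resolved to a physical path with the standard function module FILE_GET_NAME. A hedged sketch, assuming a logical file name ZBW_OPENHUB_FILE whose physical file definition contains <PARAM_1>/<PARAM_2> placeholders for year and period (both names are assumptions for illustration):

```abap
* Sketch: resolve a logical file name to a physical file name.
* ZBW_OPENHUB_FILE and the parameter values are placeholders; the
* logical name must be maintained in transaction FILE beforehand.
DATA lv_file TYPE filename-fileextern.

CALL FUNCTION 'FILE_GET_NAME'
  EXPORTING
    logical_filename = 'ZBW_OPENHUB_FILE'
    parameter_1      = '2010'   " e.g. fiscal year
    parameter_2      = '012'    " e.g. posting period
  IMPORTING
    file_name        = lv_file
  EXCEPTIONS
    file_not_found   = 1
    OTHERS           = 2.

IF sy-subrc <> 0.
  " Logical file name not maintained: handle the error here.
ENDIF.
```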

  • Open Hub (SAP BW) to SAP HANA through DB connection: 'Delete data from table' option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an Open Hub destination of type DB table using this DB connection, the table is created at HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the 'Delete data from table' option:
    16 records were loaded from BW to HANA.
    Executing it a second time from BW to HANA, 32 records arrived (the load appends).
    I then executed the Open Hub service with the 'Delete data from table' option checked.
    Now I get the short dump DBIF_RSQL_TABLE_KNOWN.
    When testing from SAP BW to SAP BW, it works fine.
    Is this option supported through a DB connection or not?
    Please see the attachment in this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the Open Hub definition level (Destination tab and field definition):
    I have already selected that checkbox; that is exactly my issue. Even with it selected,
    the deletion is not performed at target level.
    SAP BW to SAP HANA via DB connection:
    1. First DTP execution from BW: e.g. 16 records loaded to HANA, 16 arrive.
    2. Second execution from BW: the HANA side appends, so 16 + 16 = 32.
    3. So I selected the 'Delete data from table' checkbox at Open Hub level.
    4. Executing the DTP now throws the short dump DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the 'Delete data from table' option applicable to HANA at all?
    Thanks
    Santhosh Kumar

  • How to delete the generated files from the application server (Open Hub)?

    hi experts,
    When I execute the DTP in the process chain, it gives the dump below: exception CX_RSBK_REQUEST_LOCKED logged.
    When I execute the DTP manually and try to delete the previous request, it gives the dump ITAB_DUPLICATE_KEY.
    Also, to delete the generated files from the application server: how can I delete them for specific dates?
    Information on where terminated:
    Termination occurred in the ABAP program "GPD6S3OE0BCVGC6L9DBNVYQARZM" in "START_ROUTINE".
    The main program was "RSBATCH_EXECUTE_PROZESS".
    In the source code, the termination point is in line 2874 of the (Include) program "GPD6S3OE0BCVGC6L9DBNVYQARZM".
    The program "GPD6S3OE0BCVGC6L9DBNVYQARZM" was started as a background job.
    When I check the dump, it points to the code below:
    " Populate the lookup table for 0STOR_LOC
    SELECT * FROM /BI0/TSTOR_LOC
      INTO CORRESPONDING FIELDS OF TABLE l_0stor_loc_text
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE stor_loc = SOURCE_PACKAGE-stor_loc.
    The program is syntactically correct, though. How can I rectify the issue?
    regards
    venuscm

    hi experts,
    We have written a start routine to get the storage location text and send it to a file on the application server through Open Hub.
    Here is the code in the transformation.
    In the global section:
    * Text for 0STOR_LOC
    DATA: l_0stor_loc_text TYPE HASHED TABLE OF /bi0/tstor_loc
          WITH UNIQUE KEY stor_loc.
    DATA: l_0stor_loc_text_wa TYPE /bi0/tstor_loc.
    And the code to get the text:
    " Populate the lookup table for 0STOR_LOC
    SELECT * FROM /bi0/tstor_loc
      INTO CORRESPONDING FIELDS OF TABLE l_0stor_loc_text
      FOR ALL ENTRIES IN SOURCE_PACKAGE
      WHERE stor_loc = SOURCE_PACKAGE-stor_loc.
    I'm sure the problem is in this routine. If the code needs to change, please provide the corrected version.
    thanks
    venuscm
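The ITAB_DUPLICATE_KEY dump matches a known pitfall: SELECT ... FOR ALL ENTRIES can return duplicate rows, and inserting them into a hashed table with a unique key terminates the program. A hedged sketch of one possible fix for the routine above (selecting into a standard table first is a swapped-in technique, not the original author's code):

```abap
* Sketch: duplicate-safe lookup of 0STOR_LOC texts for the start routine.
DATA lt_texts TYPE STANDARD TABLE OF /bi0/tstor_loc.

* Guard: FOR ALL ENTRIES with an empty driver table selects everything.
IF source_package IS NOT INITIAL.
  SELECT * FROM /bi0/tstor_loc
    INTO CORRESPONDING FIELDS OF TABLE lt_texts
    FOR ALL ENTRIES IN source_package
    WHERE stor_loc = source_package-stor_loc.

* Remove duplicates before filling the hashed table with its unique key.
  SORT lt_texts BY stor_loc.
  DELETE ADJACENT DUPLICATES FROM lt_texts COMPARING stor_loc.
  l_0stor_loc_text = lt_texts.
ENDIF.
```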

  • Can we set a key figure in between 2 characteristics in an Open Hub destination?

    Can anyone tell me how to insert a key figure between the characteristics in an Open Hub destination?
    Example: key figure: Quantity; characteristics: Material and Plant.
    I want the field order to be:
    Material
    Quantity
    Plant
    but when I save the OHD, I get the fields like this:
    Material
    Plant
    Quantity
    Can anyone help me with this?

    Hi,
    If I am not wrong, you want to rearrange the OHD fields.
    If so: select the field you want to move and press the 'Cut' button,
    then place the cursor at the target position and press 'Insert in New Row'.
    Regards
    Md Zubair Sha

  • Open Hub is not working after upgrade

    Hi Guys,
    I did an SAP BI upgrade from 7.0 to 7.3.
    The open hub destination was working fine before the upgrade, I could see the data coming in the file.
    After the upgrade, there is no data coming to the file. System says "no data available".
    Did anyone encounter this issue before?

    Hi,
    Did you check the documented behavior below? It may explain what you are seeing.
    Deleting Data from the Table
    With an extraction to a database table, you can either retain the history of the data or just
    store the new data in the table. Choose Delete Data from Table when defining your
    destination if you want to overwrite the fields. In this case, the table is completely deleted and
    regenerated before each extraction takes place. We recommend that you use this mode if you
    do not want to store the history of the data in the table. If you do not select this option, the
    system only generates the table once before the first extraction. We recommend that you use
    this mode if you want to retain the history of the extracted data.
    Note that if changes are made to the properties of the database table (for example, fields are
    added), the table is always deleted and regenerated.
    Regards,
    Suman

  • Open Hub Destination question

    Hi,
    We want to be able to extract data from a cube and would like to use Open Hub Destination for it. We also want to extract deltas.
    The question is ..
    Should we be making a single info package for this with delta capability ..
    or 2 separate info packages .. 1 for full mode and another for delta mode.
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to extract the data will be the source.
    First create the Open Hub destination in transaction RSBO, select the target there, and create the transformation to the table you want to export to.
    Then create the DTP between the cube and the table.
    You can use a delta DTP directly: the first run brings all records, and later runs update only the delta records.
    Thanks & Regards,
    sathish

  • Open Hub in BW 3.5

    Hello Experts,
    I need to dump all the data (with all the characteristics) of an InfoCube into a table.
    For this, I am implementing it through an InfoSpoke.
    I am selecting all 42 InfoObjects present in my InfoCube.
    However, I am getting an error saying that only 16 characteristics are allowed.
    Can someone tell me how I could dump the entire data of the cube (with all the characteristics) into a table through Open Hub?
    If not through Open Hub, is there any alternative method to dump the cube data into a table?
    Regards
    Dipali

    Thanks Pramod and Praveen; I have assigned points.
    My problem is solved and I am able to proceed further.
    I am now proceeding to dump the data from the cube into the DB table.
    I have activated my InfoSpoke and now have the destination, e.g. /BIC/OHZTEST.
    However, I cannot see this table in SE11 yet.
    Is there something I am missing in order to push the data from my cube into the DB table?
    PS: I have already selected the cube as my data source.
    Regards
    Dipali

  • Cube to Open Hub DB destination - Aggregation of records

    Hi Folks,
    I am puzzled by the BW 7.0 Open Hub DB destination with regard to aggregation.
    With the BW 3.5 Open Hub DB destination, I got records from the cube already aggregated depending on which fields I selected. E.g. the cube has calendar week and calendar month, but with only calendar month selected in the InfoSpoke, I get one record per month (not several, one for each week of the month).
    With the BW 7.0 Open Hub destination this seems to be different. Although calendar week is not used in any transformation rule and is not part of the destination definition, I still get all weeks of the month as individual records. In theory this would not be a problem if the records were aggregated according to the semantic key of the Open Hub destination table, but instead an error is issued: a duplicate-record short dump.
    So am I right that with BW 7.0, record aggregation such as Cal Week/Month -> Cal Month is not possible at all? Or am I doing something wrong?
    Do I need an intermediate DSO in between, or is there another way to get the aggregation working "directly" in Open Hub?
    This is quite a shortcoming of Open Hub, not to mention the non-availability of navigational attributes and the fact that only cubes, not MultiProviders, can be the source. Open Hub in BW 7.0 seems to have gotten worse compared to BW 3.5.
    Thanks for all replies in advance,
    Axel

    Hi Axel,
    We can use 0CALMONTH in the Open Hub destination. In BI 7.0 we cannot extract data from a MultiProvider using Open Hub.
    But in BW 7.30 we have this functionality (using a DTP we can extract data from a MultiProvider to an OHD).
    There is no need for an intermediate DSO; we can extract data directly from the InfoCube.
    Please check the below documents.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?quicklink=index&overridelayout=true
    Regards,
    Venkatesh

  • Open Hub  - Infosets as source and 'template fields' button

    Hi all,
    Two questions re Open Hub on 04s (we are on SP8).  I've surfed OSS, help.sap.com and SDN with no luck re answers to these:
    - I've defined an infoset as my source of an open hub destination.  It appears I must create a transformation, but it generates no proposal, and I'm not sure how/if I can add a virtual object (such as an infoset) as a source in a transformation.  It appears that the DTP requires this transformation to be defined.  Am I missing something?  So far, testing on all other objects has been successful.
    - How does one use the button 'Template Fields' when defining fields in an OHD?  I expected to see options to insert groups of fields (ex. all fields of an infoobject and its attributes), but I don't see any options available.
    Thanks!
    Lonnie

    Update:
    I was able to create the transformation (issue found between keyboard and chair).
    New problem: I am unable to activate my DTP. When choosing the source for my DTP, I get the usual popup to choose the possible sources, but the InfoSet is not available for selection. Only the 'pieces' of the InfoSet (the two DSOs that comprise it) are offered, which doesn't help, because my source is the InfoSet, not its individual tables.
    Also, I cannot change the extraction mode to FULL; only delta is available.
    Thanks in advance for any help.

  • Open Hub Destination

    Hi All,
    I have created an Open Hub destination with a file as the data target. The generated file has Volume as one of the fields, with data such as 00015 (the data length of Volume is 5, which is why I get 00015). I do not want the leading 000 before the actual value 15. Is there any way to change the length of the InfoObject in the Open Hub destination, or in the transformation?
    Thanks In Advance
    Bobby

    Hi,
    While mapping the field in the transformation, select rule type 'Routine' and write code along these lines:
    DATA: v_output TYPE string.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
      EXPORTING
        input  = SOURCE_FIELDS-volume
      IMPORTING
        output = v_output.
    RESULT = v_output.
    Then activate the transformation. It should work.
    Regards,
    Satya

  • Open Hub services in BI 7.0

    Hi Experts,
    I am working on BI 7.0 and have a requirement to extract data from a standard sales cube (0SD_C03). I will use Open Hub services, but I also need to send the master data attributes along with this to the application file server.
    I did the following steps:
    1) Created an Open Hub destination as a CSV file and selected the fields I needed (the cube has 80+, but I selected only 10 fields for my file).
    2) Added the InfoObject YINVGP, which is an attribute of 0MATERIAL.
    3) The transformations were created automatically by the system.
    4) Created a transformation rule between 0MATERIAL and its attribute YINVGP, selecting the rule type 'Read Master Data'.
    5) When I click 'Transfer Values' to activate the changes, I get the following errors:
    Rule 11 (Target: /BIC/YINVGP, Group: 01 Standard group): No master data characteristic selected.
    Rule (Target Field: /BIC/YINVGP): No source assigned.
    Can the experts confirm whether I am missing anything? If so, please suggest how to get the master data using Open Hub services.
    If this is possible with Open Hub services, I would really appreciate screenshots or a step-by-step procedure, as I have not worked with Open Hub before.
    thanks in advance

    Hi Naga,
    I have seen the link you sent me. It doesn't really help, but thanks anyway. I am able to create a CSV file; that is not the problem. My question is whether I can get the attributes of 0MATERIAL to appear in the same file. SAP provides an option to read master data in the transformation rules, but when I try to use that rule, I get an error saying no source has been assigned.
    thanks

  • Error in Open Hub Destination Transport

    Hi All,
    I am facing the following problem in transporting the openhub destination.
    We have created a Open Hub Destination with following settings.
    Destination Type :- 'FILE'
    We have ticked the application Server name checkbox.
    In the server name we have selected our BI Development Server.
    In the type of File Name we have selected 'Logical File Name'. We have maintained the details in AL11 and FILE transactions.
    While transporting it to our quality system, it throws an error saying it does not recognize the server name (which is true; the server name is different in the quality system).
    Could you guide me: is there a setting, similar to source system conversion, that would convert the server name?
    Also, guidance on best practices for transporting the entries we created in AL11 and FILE would be really nice. I searched SDN and could not find anything useful for my case.
    Thanks in advance.

    Hi,
    Your transport request is probably failing because the application server file system is different in your target system.
    Go to transaction AL11 and check. If the same folder structure is not available, get it created, or go to RSA1 and change your InfoSpoke settings to point to the application server file path as it exists in your target system.
    You have to point this path to the Q server.
    ~AK
