Open hub issues

Hi,
Open hub issues occurred in the process chain: some of the files didn't reach the external customer's directories. I want to know where I can insert ABAP code (BW 7.0) so that the external tool picks up the files only once all files have arrived in BWP/Interface (visible in transaction AL11).
In 3.5 you could insert the code in the InfoSpoke using a BAdI. The only possibility I can think of in 7.0 is transformations, but I'm not sure where to insert it.
Thanks in advance
krish

Hi Krish,
It is my understanding that the files are not available on the application server until after the transformation step has completed, so a routine in the transformation would not help.
Maybe you could create a program and insert it as a subsequent step of the process chain?
Best Regards,
Vincent
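
Along the lines of Vincent's suggestion, here is a minimal sketch of such a program, assuming a plain trigger-file convention; the directory and file names are hypothetical, not from this thread. Once the open hub files are complete, it writes a trigger file to the interface directory, and the external tool starts its pickup only when that trigger appears.

    REPORT z_ohd_trigger_file.
    " Hypothetical sketch: write a trigger file on the application server
    " (visible in AL11) once the open hub files are complete, so that the
    " external tool only starts its pickup after everything has arrived.
    " Directory and file names are assumptions.
    DATA: lv_dir     TYPE string VALUE '/usr/sap/BWP/interface/',
          lv_trigger TYPE string.

    CONCATENATE lv_dir 'files_complete.trg' INTO lv_trigger.

    OPEN DATASET lv_trigger FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      TRANSFER 'OK' TO lv_trigger.
      CLOSE DATASET lv_trigger.
    ELSE.
      MESSAGE 'Could not create trigger file' TYPE 'E'.
    ENDIF.

Inserted as an ABAP program step after the open hub DTPs in the chain, it runs only once all preceding loads have finished successfully.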

Similar Messages

  • Open Hub Objects Transport Issue

    Hi,
    Environment: BI 7, SP level 17.
    We are trying to transport an Open Hub destination DTP object (DSO to flat file) from the Dev box to the QA box. The output file name is defined via a logical file created in transaction FILE; the logical file name is available in both Dev and QA.
    In the Dev box we are able to write the output to a flat file on the application server.
    While transporting the DTP we get a transport error with return code 12 (there is no clear message for this error, and no dump file is created).
    We also applied OSS Notes 1175391 and 1241173.
    Your valuable suggestions to overcome this issue are highly appreciated.
    Thanks,
    Madhu.

    Hi Madhu,
    I think the link below will help you:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403
    Regards
    Sudheer

  • Open hub destination issue

    Hi,
    In our project we have a client instance (e.g. BWPD) and an application server (e.g. BWPRD) defined for load balancing.
    We have created an open hub and assigned the destination server BWPD.
    When I execute the DTP manually in BWPD, it runs successfully.
    However, the same DTP fails when placed in the process chain, with the error message:
    No Such File or Directory
    Could not open file D:\usr\sap\BWPD\A01\work\Material on application server
    Error while updating to target ZXXXX.
    Options tried:
    Scheduled the process chain on the background server BWPD (the same server specified in the open hub destination); the DTP still failed.
    Tried with the application server; it failed.
    Tried with HOST as the option; it failed.
    I can't make out what is going wrong. Any thoughts?
    Regards.

    Hi there,
    I found this doc quite useful; maybe it can shed some light on your issue:
    Creating Open Hub Destination using a Logical file to extract the data
    Also, what OS do you have? Is the syntax group created accordingly?
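
    To verify the logical file setup, a small sketch follows; the logical file name ZTEST_DATA_FILE is a hypothetical placeholder for whatever is defined in transaction FILE. It shows which physical path the current application server resolves the logical name to:

        " Sketch: resolve a logical file name on the current application server.
        " The logical file name below is a hypothetical placeholder.
        DATA lv_fname TYPE filename-fileextern.

        CALL FUNCTION 'FILE_GET_NAME'
          EXPORTING
            logical_filename = 'ZTEST_DATA_FILE'
          IMPORTING
            file_name        = lv_fname
          EXCEPTIONS
            file_not_found   = 1
            OTHERS           = 2.
        IF sy-subrc = 0.
          WRITE: / 'Resolved physical file:', lv_fname.
        ELSE.
          WRITE: / 'Logical file not resolved; check the syntax group for this OS.'.
        ENDIF.

    If the resolution differs between servers, or fails on one of them, the syntax group / operating system assignment in FILE is the usual culprit.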

  • Issue for DTP from DSO to open hub destination

    Hello Gurus,
    I have an issue with a DTP from a DSO to an open hub destination. The long text for the error in the monitor is as follows:
    "Could not open file SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2 on application server"
    "Error while updating to target ZFIGLH03 (type Open Hub Destination)"
    For the open hub destination, I checked the configuration of the logical file name, which is "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr".
    I am wondering where the file "SAPBITFS\tst_bi\bit\BIWork\GENLGR_OneTimeVendr_2" in the error message comes from.
    Many thanks,

    Hi
    You do not need to create the file on the application server; it is created automatically.
    But if you have defined a logical file name in transaction FILE and used it in the OHD, and that definition is not correct, there will be a conflict. Check this out.

  • Issue in transport regarding Open Hub Services

    Hi All,
    I have a strange scenario: I am moving one of my open hub related transports from the system test environment to UAT, but each time it fails.
    The story behind it: I have an InfoSpoke based on an ODS, where I have included a field called BRAND in the source structure.
    The transport went fine from Dev to the system test environment, but it fails while moving from system test to UAT.
    The error is:
    Program ZCL_IM_ZLP_WRKF===============CP, Include ZCL_IM_ZLP_WRKF===============CM001: Syntax error in line 000085
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Any comment will be appreciated.
    Regards,
    Kironmoy Banerjee.

    Hi,
    Generally, transports fail in a few situations: missing sequence and dependencies, inactive objects, or RFC issues.
    So we should follow the sequence and dependencies while transporting the objects, and they should be in an active state in the source system first.
    In your case, check whether your source fields are mapped correctly and are in an active state.
    As per your error:
    The data object 'WA_SOURCE' does not have a component called '0IS_CLAIM__BRANDVALE'.
    Check why the source 'WA_SOURCE' does not have the component '0IS_CLAIM__BRANDVALE'; most likely the source structure is missing that component.
    See whether it is mapped correctly and active, follow the sequence if there are any dependencies, and re-transport the request.
    Regards.
    Rambabu

  • Open Hub DTP issue (Error 21 when writing to the local workstation)

    Hi,
    I am using an open hub destination to output a CSV file to a shared folder on a local workstation. The data provider has around 50,000 rows (I expect a roughly 40 MB file). Whenever I execute the DTP, it runs for the first data package and then fails with the error message below.
    Error 21 when writing to the local workstation
    Error while updating to target XXXXXXX (type Open Hub Destination)
    Operation  could not be carried out for
    Exception CX_RS_FAILED logged
    Exception CX_RS_FAILED logged
    Sometimes the file gets created with partial entries; most of the time the file is not created at all. I created new DTPs and tried the same. I changed the package size, servers, parallel processes, etc., but to no avail.
    Thanks Neo

    Hi Neo,
    Did you try to define the logical/physical file path in transaction FILE? If not, can you try this option and see if it works?
    Also, can you reduce the data package size in your DTP settings and check whether you still get this error?
    Thanks
    Abhishek Shanbhogue

  • Open Hub Destination Delta issue

    Hi experts,
    We have a scenario in which an open hub is created on top of a cube.
    Regular deltas move from this cube to the open hub daily.
    But as per a request we had to drop the cube data, and the reload took 2 days to complete.
    So those 2 days of data were missed in the open hub.
    Now, as the new requests are not datamarted, the next time the DTP between the cube and the open hub runs, it will fetch all the requests; we don't want that.
    So, can we run an init without data transfer so that the source status is fetched?
    But then how do we load those 2 days of data alone?
    Kindly advise.
    Edited by: neethacj on Mar 7, 2012 1:06 PM

    If the cube doesn't have any other targets apart from this OHD:
    delete the two days' requests from the cube, set the init from the cube to the OHD,
    then load the cube with the delta (which brings back the two days' requests) and run the OHD.

  • Open Hub Destination with Application Server Issue

    Hi all,
    I have a requirement to save monthly data on the application server with the help of an open hub destination. I have created a DSO, and on that DSO I have created an open hub with the application server as the target. When I store each monthly file on the application server, the file name should follow the system date.
    So I have:
    Created a logical file path: ztest_data
    Assigned a physical path to the logical file path: /usr/sap/<sysid>........
    Created a logical file name: Ztest_data_file....
    In the open hub I selected Application Server as my target and checked the box.
    I selected Logical File Name and entered the file name I created: ztest_data_file...
    I ran the DTP and it shows no errors, but when I check in AL11, in the path I specified (/usr/sap/<sysid>........), I couldn't find the file.....
    Please check this and let me know...
    Thanks In Advance,
    Bobby.

    Hi,
    Please check the link /people/jyothi.velpula/blog/2010/01/20/creating-open-hub-destination-using-a-logical-file-to-extract-the-data .
    Also, is your DTP full or delta? Try running the DTP as full.
    Hope it helps.
    Best Regards,
    Kush Kashyap

  • Open Hub (SAP BW) to SAP HANA through DB connection data loading: Delete data from table option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB connection (named HANA).
    Whenever I create an open hub destination of type DB table using this DB connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the open hub service without the Deleting Data from Table option checked:
    16 records were loaded from BW to HANA, the same on both sides.
    The second time I executed it, 32 records arrived (it appends).
    Then I executed the open hub service with the Deleting Data from Table option checked:
    now I get the short dump DBIF_RSQL_TABLE_KNOWN.
    From SAP BW system to SAP BW system it works fine.
    Does this option work through a DB connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the OH level (definition level: Destination tab and field definition).
    There is a check box there which I have already selected, but even though it is selected,
    the deletion at the target level is not performed.
    SAP BW to SAP HANA via DB connection:
    1. First execution from BW: 16 records, DTP executed, loaded to HANA, 16 on both sides.
    2. Second execution from BW: the HANA side appends, so 16+16 = 32.
    3. So I selected the Deleting Data from Table check box at the OH level.
    4. Now the DTP throws a short dump: DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Does the Deleting Data from Table option apply to HANA at all?
    Thanks
    Santhosh Kumar

  • How to delete the generated files from the application server (open hub)?

    hi experts,
    When I try to execute the DTP in the process chain, it gives the dump below: Exception CX_RSBK_REQUEST_LOCKED logged.
    When I execute the DTP manually and try to delete the previous request, it gives the dump ITAB_DUPLICATE_KEY.
    So, to delete the generated files from the application server: how do I delete them for specific dates?
    Information on where terminated
    Termination occurred in the ABAP program "GPD6S3OE0BCVGC6L9DBNVYQARZM" - in
    "START_ROUTINE".
    The main program was "RSBATCH_EXECUTE_PROZESS ".
    In the source code you have the termination point in line 2874
    of the (Include) program "GPD6S3OE0BCVGC6L9DBNVYQARZM".
    The program "GPD6S3OE0BCVGC6L9DBNVYQARZM" was started as a background job.
    When I check the dump, it points to the code below:
    " Populate the lookup table for 0STOR_LOC
    SELECT * from /BI0/TSTOR_LOC
    into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
    FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
    STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    But the program is syntactically correct.
    How do I rectify this issue?
    regards
    venuscm
    Edited by: venugopal vadlamudi on Sep 28, 2010 1:59 PM

    hi experts,
    We have written a start routine to get the storage location text and send it to a file located on the application server through open hub.
    Here is the code written in the transformation.
    In the global section:
    Text for 0STOR_LOC
        DATA: l_0stor_loc_text TYPE HASHED TABLE OF /bi0/tstor_loc
              WITH UNIQUE KEY stor_loc.
        DATA: l_0stor_loc_text_wa TYPE /bi0/tstor_loc.
    And in the code, to get the text:
    " Populate the lookup table for 0STOR_LOC
        SELECT * from /BI0/TSTOR_LOC
          into CORRESPONDING FIELDS OF table L_0STOR_LOC_TEXT
          FOR ALL ENTRIES IN SOURCE_PACKAGE WHERE
                  STOR_LOC = SOURCE_PACKAGE-STOR_LOC.
    I'm sure the problem is with the routine. I think I need to change the code; if so, please provide the modified version.
    thanks
    venuscm
    Edited by: venugopal vadlamudi on Sep 29, 2010 9:37 AM
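
    A plausible correction, not from the original thread: the text table /BI0/TSTOR_LOC is keyed by language and storage location, so SELECT * returns one row per language, while L_0STOR_LOC_TEXT is a hashed table with UNIQUE KEY stor_loc; the second language row for the same storage location then raises ITAB_DUPLICATE_KEY. Restricting the lookup to a single language (and guarding against an empty source package) should avoid the dump:

        " Sketch of a corrected lookup, assuming the dump is caused by multiple
        " language rows per storage location violating the UNIQUE KEY stor_loc.
        IF source_package IS NOT INITIAL.
          SELECT * FROM /bi0/tstor_loc
            INTO CORRESPONDING FIELDS OF TABLE l_0stor_loc_text
            FOR ALL ENTRIES IN source_package
            WHERE stor_loc = source_package-stor_loc
              AND langu    = sy-langu.  " one language -> unique key holds
        ENDIF.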

  • Open Hub is not working after upgrade

    Hi Guys,
    I did an SAP BI upgrade from 7.0 to 7.3.
    The open hub destination was working fine before the upgrade, I could see the data coming in the file.
    After the upgrade, no data comes to the file; the system says "no data available".
    Did anyone encounter this issue before?

    Hi,
    Did you set up the destination the same way after the upgrade? Try looking at it from this angle:
    Deleting Data from the Table
    With an extraction to a database table, you can either retain the history of the data or just store the new data in the table. Choose Delete Data from Table when defining your destination if you want to overwrite the fields. In this case, the table is completely deleted and regenerated before each extraction takes place. We recommend that you use this mode if you do not want to store the history of the data in the table. If you do not select this option, the system only generates the table once, before the first extraction. We recommend that you use this mode if you want to retain the history of the extracted data.
    Note that if changes are made to the properties of the database table (for example, fields are added), the table is always deleted and regenerated.
    Regards,
    Suman

  • Open hub error when generating a file on the application server

    Hi, everyone.
    I'm trying to execute an open hub destination that saves the result as a file on the application server.
    The issue: in the production environment we have two application servers; XYZ is the database server and A01 is the application server. When I direct the open hub to save the file on A01, everything works fine. But when I change it to save to XYZ, I get the following error:
    >>> Exception in Substep Start Update...
    Message detail: Could not open file "path and file" on application server
    Message no. RSBO214
    When I use transaction AL11, I can see the file there in the XYZ filesystem (with date and time corresponding to the execution), but I can't view the content, and the size appears to be zero.
    Possible causes I already checked: authorization, disk space, SM21 logs.
    We are on SAP BW 7.31 support package 6.
    Any idea what the issue could be, or where to look?
    Thanks and regards.
    Henrique Teodoro

    Hi, there.
    Posting the resolution for this issue.
    SAP support gave directions that solved the problem. No matter which server (XYZ or A01) I log on to or start the process chain from, the DTP job always runs on the A01 server, and this causes an error since the directory doesn't exist on server XYZ.
    This happens because the DTP settings for the background job were left blank. I followed these steps to solve the problem:
    - open the DTP
    - go to "Settings for Batch Manager"
    - in "Server/Host/Group on Which Additional Processes Should Run", pick the desired server
    - save
    After that, no matter where I start the open hub extraction from, it always runs on the specified server and saves the file accordingly.
    Regards.
    Henrique Teodoro

  • Cube to Open Hub DB destination - Aggregation of records

    Hi Folks,
    I am puzzled by the BW 7.0 open hub DB destination with regard to aggregation.
    With a BW 3.5 open hub DB destination, I got records from the cube already aggregated depending on which fields I selected. E.g. the cube has calendar week and calendar month, but with only calendar month selected in the InfoSpoke I get just one record per month (not several, one per week of the month).
    With the BW 7.0 open hub destination this seems to be different. Although calendar week is not used in any transformation rule and is not part of the destination definition, I still get all weeks of the month as single records. In theory this would not be a problem if the records were aggregated according to the semantic key of the open hub destination table, but instead an error is issued: a duplicate-record short dump.
    So do I understand correctly that with BW 7.0 record aggregation, e.g. calendar week/month to calendar month, is not possible at all? Or am I doing something wrong?
    Will I need an intermediate DSO in between, or is there another way to get the aggregation working "directly" in the open hub?
    This is quite a shortcoming of the open hub, not to mention the non-availability of navigational attributes and that the source can only be cubes, not MultiProviders... it seems the open hub in BW 7.0 got worse compared to BW 3.5.
    Thanks for all replies in advance,
    Axel

    Hi Axel,
    We can use 0CALMONTH in the open hub destination. In BI 7.0 we cannot extract data from a MultiProvider using open hub,
    but BW 7.30 has this functionality (using a DTP we can extract data from a MultiProvider to an OHD).
    There is no need for an intermediate DSO; we can extract data directly from the InfoCube.
    Please check the below documents.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?quicklink=index&overridelayout=true
    Regards,
    Venkatesh

  • Open Hub does not work after upgrade

    We just upgraded our BW production system from 3.0b to 3.5 and are running support package level 09. Now we are facing serious problems with open hub.
    After the upgrade, open hub delivering data from an InfoCube into CSV files does not work. When running the InfoSpoke, it delivers 0 data records even though it should bring a considerable amount of data.
    There is no error message or short dump; it just does not deliver any data. The InfoCube has data in place and is available for reporting. The extraction is made in FULL mode and there are no transformations. The funny thing is that extraction from an ODS or InfoObject works perfectly fine.
    Has anyone an idea how to solve this?
    Thanks in advance,
    JL

    We ran into the same issue as Jari Laine today.
    We just upgraded our BW sandbox environment from 3.0b to 3.5, though we are running support package level 13. Now we are facing serious problems with Open Hub.
    After the upgrade, our Open Hub which delivers data from an info cube into CSV-files does not work any more.
    When running the Info Spoke it will deliver 0 data records even though it should bring a considerable amount of data.
    There is no error message or short dump, it just does not deliver any data. The Info Cube has data in place and it is available for reporting.
    The Extraction is made in FULL-mode and there are no transformations.
    Has anyone an idea how to solve this?
    Thanks in advance,
    Marc

  • Open hub destination with datasource as the source

    Hi,
    I am trying to use the new features of BI for open hub destination where you can use datasource as the source for extraction.
    I have created an InfoPackage, loaded data to the PSA, and from there to the open hub destination via the DTP.
    When I try to execute the DTP, I get the following error:
    Data package processing terminated.
    Error in substep:
    You are not authorized to use this transformation.
    I have been able to use the DTP to load open hub destination targets with datastore and infocubes as source.
    Also when we turn on the security trace, it doesn't fail anywhere.
    Has anybody faced a similar problem?
    Any help appreciated,
    Thanks,
    Payal.

    I have created two OHDs - one for a 3.5 DataSource and one for a 7.0 DataSource (they are different DataSources). While creating the DTP for the 7.0 DataSource, it did create a default transformation mapping.
    But I still get same error in both cases.
    When I create the DTP, it tries to create a transformation, but I get an information message:
    Exception CX_RS_MSG occurred (program: CL_RSBK_PATH==================CP, include: CL_RSBK_PATH==================CM00I, line: 16).
    I can still activate the DTP, but when I run it, it says I don't have authorization for this transformation.
    I think there are issues while creating the transformation, but I don't know what!
    -Payal.
