Open Hub Delta Records

Hi,
Can we download delta records from a DSO / Cube using Open Hub? Could someone please tell me.
Thanks,
GAL

Hi,
Yes, it is possible:
1) Create an Open Hub destination on top of the Cube/DSO, then create a delta DTP and execute it.
2) Select the destination type as CSV file or database table.
If you select CSV file as the destination type:
1) If you want to download it to your desktop or local machine, click on the directory and choose the path.
2) If you want to download it to the application directory, check the Application Server option; the directory path is shown automatically and the file will be placed in that directory.
If you select database table as the destination type:
1) You can see the database table name on the screen; go to SE11 and download the data from that particular table.
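A minimal sketch of the database-table case (assumptions: /BIC/OHZSALES is a hypothetical generated open hub table name and C:\temp\ohd_delta.txt a hypothetical target path): read the table that the delta DTP filled and save it to the local PC, instead of exporting it manually from SE11.

REPORT z_download_ohd_table.

" Hypothetical generated open hub table; replace with your own /BIC/OH* table
DATA lt_data TYPE STANDARD TABLE OF /bic/ohzsales.

" Read everything the delta DTP wrote into the open hub table
SELECT * FROM /bic/ohzsales INTO TABLE lt_data.

" Save it to the local PC as a tab-separated text file
cl_gui_frontend_services=>gui_download(
  EXPORTING
    filename              = 'C:\temp\ohd_delta.txt'
    filetype              = 'ASC'
    write_field_separator = 'X'
  CHANGING
    data_tab              = lt_data ).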

Similar Messages

  • Open Hub, Delta Transformation

    Hi All,
    Thanks in advance for your help.  I am extracting data from a DSO to a flat file using an Open hub and a delta DTP.  The problem I have is that if there is no delta (i.e. zero new/changed records) since the last time the open hub ran, my file is not cleared.
    For example, if I ran the DTP in the morning and received 100 records, the open hub creates a file in the UNIX directory with 100 records. However, if in the evening I run the DTP again and there are no new records, the file is not cleared and remains with the 100 records. This is causing a problem down the line because the data is sent twice.
    Is there any way that the DTP can output blank files? Or can the file be cleared out in a different step in the process chain?
    Your help is very much appreciated.
    Thanks

    Hi,
    This note should correct your problem: 1111259
    Summary
    Symptom
    You extract data using a Delta data transfer process (DTP) to an open hub destination of type 'File on application server'. If the source does not contain any new data, the existing file is not deleted.
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 7.0 BI
    Import Support Package 17 for SAP NetWeaver 7.0 BI (BI Patch 17 or SAPKW70017) into your BI system. The Support Package is available when Note 1106569 "SAPBINews BI7.0 Support Package 17", which describes this Support Package in more detail, is released for customers.
    In urgent cases, you can implement the correction instructions as an advance correction.
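    If the Support Package cannot be imported right away, the file-clearing idea from the question (a separate process chain step) could look like the sketch below. This is only an illustration, not part of the note, and the file path is a hypothetical example.

    REPORT z_clear_ohd_file.

    " Hypothetical path of the open hub target file on the application server
    DATA lv_file TYPE string VALUE '/interfaces/out/ohd_delta.csv'.

    " Opening the file for output and closing it immediately truncates it,
    " so the downstream system never picks up the previous day's records again.
    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      CLOSE DATASET lv_file.
    ENDIF.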
    Hope this helps.
    Regards,
    Diego

  • DTP Delta request for open hub destination.

    Hi expert,
    As you know, in BW 7.0 the open hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases the DTP delta is enabled.
    My questions are:
    1) In the context menu of the open hub destination, I did not find anything like 'Manage' that would list all the DTP requests executed for this open hub destination, but for a normal InfoCube or ODS you can find such a list.
    2) If the DTP has delta enabled, where can I delete the latest delta request and start a new load?
    3) I assume the DTP can be executed in batch, integrated into a process chain via the process type 'DTP', right? If the destination type is a third-party tool, can it also be integrated into a process chain?
    4) Can you give some obvious differences between the InfoSpoke and the open hub destination in terms of advantages and disadvantages?
    Thanks a lot for the feedback.
    Best Regards,
    Bin

    Hi,
    Please look at the links below. Hope it helps.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh

  • Cube to Open Hub DB destination - Aggregation of records

    Hi Folks,
    I am puzzled with BW 7.0 open hub DB destination in regards to aggregation.
    In the BW 3.5 open hub DB destination I got already aggregated records from the cube, depending on which fields I selected. E.g. the cube has cal week / cal month, but if only cal month is selected in the InfoSpoke I get only one record per month (not several, one for each week of the month).
    In the BW 7.0 open hub destination it seems to be different. Although cal week is not used in any transformation rule and is not part of the destination definition, I still get all weeks of the month as single records. In theory this would not be a problem if the records were aggregated according to the semantic key of the open hub destination DB table, but instead an error is issued -> duplicate record short dump.
    So do I understand correctly that with BW 7.0 record aggregation, e.g. cal week / month ---> cal month, is not possible at all? Or am I doing something wrong?
    Will I need an intermediate DSO in between, or is there another way to get the aggregation for the open hub working 'directly'?
    This is quite a shortcoming of the open hub. Not to mention the non-availability of navigational attributes, plus the fact that the source can only be cubes, not MultiProviders ... it seems that the open hub in BW 7.0 got worse compared to BW 3.5.
    Thanks for all replies in advance,
    Axel

    Hi Axel,
    We can use 0CALMONTH in the open hub destination. In BI 7.0 we cannot extract data from a MultiProvider using Open Hub.
    But in BW 7.30 we have this functionality (using a DTP we can extract data from a MultiProvider to an OHD).
    There is no need to use an intermediate DSO; we can extract data directly from the InfoCube.
    Please check the below documents.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/501f0425-350f-2d10-bfba-a2280f288c59?quicklink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/5092a542-350f-2d10-50bd-fc8cb3902e2e?quicklink=index&overridelayout=true
    Regards,
    Venkatesh

  • How to load delta, changing the name of file for every day with open hub?

    Hi gurus,
    I load daily delta data from ODS to flat file, using open hub.
    The file is located on the server. Now, when I do the delta load from the ODS to the flat file, the file is overwritten every day because it has the same name.
    I want to load the delta data every day into a different file name, for example BW_20060101, BW_20060102, BW_20060103 ...
    How do I do that?
    Best regards,
    Uros

    Hi Uros,
    This thread may give you some ideas:
    Changing the file name in Flat file Extraction
    https://websmp107.sap-ag.de/~form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700002201112003E
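    A minimal sketch of the idea (an assumption, not taken from the linked thread): derive the file name from the system date in the routine/exit that supplies the open hub file name. The 'BW_' prefix and the .csv extension are just examples.

    DATA lv_filename TYPE string.

    " sy-datum is YYYYMMDD, so on 01.01.2006 this yields BW_20060101.csv
    CONCATENATE 'BW_' sy-datum '.csv' INTO lv_filename.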
    Regards,
    BVC

  • Open Hub Header and Trailer Record.

    Hi,
    For the Open Hub Destination
    Destination Type is File,
    Application server.
    Type of File Name: 2 Logical File name.
    How do I get the header and trailer records, which should contain the creation date, the creation time and the total number of records?
    Header record layout:
    Creation Date (YYYYMMDD)
    Creation Time (HHMMSS)
    Total record count
    Trailer record layout:
    Total number of records in the file and
    XXXX (key figure) total.
    Thanks in advance.
    Regards,
    Adhvi.

    Hi Venkat,
    Write a UDF in the following way:
    Pass the first parameter as the detail node (cache the whole queue) to the UDF, and pass the second parameter as the trailer count to the UDF.
    Now loop through the detail records and get the count with a counter variable.
    Check the counter against the trailer count outside the loop.
    If it does not match, trigger the alert from the UDF itself.
    Check the link below for triggering an alert from a UDF:
    /people/bhavesh.kantilal/blog/2006/07/25/triggering-xi-alerts-from-a-user-defined-function
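    Back to the original question: one possible way to build the header and trailer on the BW side is a small post-processing program run as a process chain step after the DTP. The sketch below is only an illustration under assumptions (both file paths are hypothetical, the data file is plain text, and the key-figure total still has to be derived from your own record layout).

    REPORT z_ohd_add_header_trailer.

    DATA: lv_in      TYPE string VALUE '/interfaces/out/ohd_data.csv',   " file written by the DTP
          lv_out     TYPE string VALUE '/interfaces/out/ohd_final.csv',  " file with header/trailer
          lv_line    TYPE string,
          lt_lines   TYPE TABLE OF string,
          lv_count   TYPE i,
          lv_count_c(10) TYPE c,
          lv_header  TYPE string,
          lv_trailer TYPE string.

    " Read the data file produced by the open hub DTP
    OPEN DATASET lv_in FOR INPUT IN TEXT MODE ENCODING DEFAULT.
    DO.
      READ DATASET lv_in INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      APPEND lv_line TO lt_lines.
    ENDDO.
    CLOSE DATASET lv_in.

    lv_count   = lines( lt_lines ).
    lv_count_c = lv_count.
    CONDENSE lv_count_c NO-GAPS.

    " Header: creation date, creation time, total record count
    CONCATENATE sy-datum sy-uzeit lv_count_c INTO lv_header SEPARATED BY ';'.
    " Trailer: record count (append the key-figure total from your own record layout here)
    lv_trailer = lv_count_c.

    " Write header + data + trailer into the final file
    OPEN DATASET lv_out FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    TRANSFER lv_header TO lv_out.
    LOOP AT lt_lines INTO lv_line.
      TRANSFER lv_line TO lv_out.
    ENDLOOP.
    TRANSFER lv_trailer TO lv_out.
    CLOSE DATASET lv_out.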

  • Delete the initialization of delta on open hub table

    HI Gurus
    I have loaded data from a cube to an open hub table using a DTP with full update; later on I loaded the data using a DTP with delta update. Now I think the delta has been initialized on the open hub table, so I want to delete the initialization of the delta on the open hub table. It would be great if someone could send me a response.
    Thanks
    Rohit

    Hi Sam,
    If I understand you correctly,
    You have removed your initialization by going to InfoPackage --> Scheduler --> Initialization Options for Source --> and removing the request.
    Then you did an init without data transfer.
    Then you changed the option in the InfoPackage from init without data transfer to delta and loaded the data.
    Check the process chain: the InfoPackage you have mentioned in the selections may be pointing towards an init load instead of a delta.
    Sudhakar.

  • Open Hub Destination Delta issue

    Hi experts,
    We have a scenario in which an Open Hub is created on top of a cube.
    Regular deltas are moving from this cube to the Open Hub daily.
    But as per a request, we had to dump the cube data, and the reload took 2 days to complete.
    So those 2 days of data got missed in the Open Hub.
    Now, as the new requests are not datamarted, the next time the DTP between the cube and the Open Hub runs, it will fetch all the requests, right? We don't want that.
    So, can we run an init without data transfer so that the source status will be fetched?
    But then how do we load those 2 days of data alone?
    Kindly advise.
    Edited by: neethacj on Mar 7, 2012 1:06 PM

    If the cube doesn't have any other targets apart from this OHD:
    delete the two days' requests in the cube, set the init from the cube to the OHD,
    load the cube delta again (which brings back the two days' requests) and run the OHD.

  • How to send a million records to my own file server using Open Hub

    Hi,
    I am using the Open Hub process to send 6 million records to my own network server. Is there any limitation, and how do I need to go about this? My process fails when I execute the open hub DTP. I am able to send a few records to the SAP application server.
    Did anybody send more than a million records to an application server outside SAP? How do I achieve this? Please help me out.
    I am trying to send ODS records to a file server.

    I'm glad you solved your problem.
    Generally it is nice to post how you solved your problem so that when others have a similar problem and search the archives, they can see your solution.
    Thanks

  • Open hub - output structure enhancement - will the delta need to be disturbed?

    Hi,
    We have made some changes to the Open Hub output structure and added a few output fields; now we need to move them to production. I can see that in production it is delta-enabled, i.e. only newly added data is written to the output file.
    Now that I need to move the new structure to production, I wanted to know whether I need to disturb the delta management for the Open Hub output, or whether it will be taken care of automatically.
    Kindly advise.

    Thanks,
    Which is the best ABAP function to extract an SAP table to a flat file?
    Because at this point, I have put the following in IF_EX_OPENHUB_TRANSFORM~TRANSFORM:
    method IF_EX_OPENHUB_TRANSFORM~TRANSFORM.
      data: l_s_data_in  type /BIC/CYZEPC_OH_EPCOT_PROSPER,
            l_s_data_out type ZEPC_I_EPCOT_PROSPER,
            l_dummy.
      " Log entry so the start of the exit is visible in the monitor
      message i899(rsbo) with 'exit started' into l_dummy.
      i_r_log->add_sy_message( ).
      clear e_t_data_out.
      " Map each incoming open hub record to the target structure
      loop at i_t_data_in into l_s_data_in.
        move: l_s_data_in-/BIC/ZEPCC0018 to l_s_data_out-MARQUEFIL,
              l_s_data_in-/BIC/ZEPCC0005 to l_s_data_out-GENRE,
              l_s_data_in-CALMONTH       to l_s_data_out-MOISFERME,
              l_s_data_in-/BIC/ZEPCC0008 to l_s_data_out-TYPPROSPER,
              l_s_data_in-CALYEAR        to l_s_data_out-ANNEE,
              'ABC'                      to l_s_data_out-COD_MVT.
        insert l_s_data_out into table e_t_data_out.
      endloop.
      " Log entry marking the end of the exit
      message i899(rsbo) with 'exit finished' into l_dummy.
      i_r_log->add_sy_message( ).
    endmethod.
    So now my destination DB table, /BIC/OHZEPC_OH_E (the same as structure ZEPC_I_EPCOT_PROSPER above), should have the data inside.
    Can I now just insert a function at the end to download the table to a flat file?
    Thanks,
    Tim
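    A minimal sketch for the last question (an assumption, not a confirmed recommendation: it presumes the destination structure is flat and character-like, and the target path is a hypothetical example): read the generated table and write it to a file on the application server with OPEN DATASET / TRANSFER.

    REPORT z_ohd_table_to_file.

    DATA: lt_out  TYPE STANDARD TABLE OF /BIC/OHZEPC_OH_E,
          ls_out  TYPE /BIC/OHZEPC_OH_E,
          lv_file TYPE string VALUE '/interfaces/out/epcot_prosper.txt'.

    " Read the records the open hub DTP (and the exit above) wrote to the table
    SELECT * FROM /BIC/OHZEPC_OH_E INTO TABLE lt_out.

    " Write one line per record to the application server file
    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    LOOP AT lt_out INTO ls_out.
      TRANSFER ls_out TO lv_file.
    ENDLOOP.
    CLOSE DATASET lv_file.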

  • Open hub delete delta

    Hello,
    I'm using open hub in BI 7.0. How does the delta queue work for this tool?
    I'm extracting data from a cube to a flat file with delta. I'm storing the flat file on the SAP server for further processing.
    How do I handle re-runs of the delta? For example, if something goes wrong in the load, how do I re-run the last delta without having to reload the entire content of the source cube?
    Thanks,
    F C

    Hello again,
    But I cannot delete the request from the cube. This is the source of the load. I do not wish to delete the source data, but only the "delta queue", in order to be able to reload. Please elaborate further.
    I have noticed that setting the monitor status to red will delete the target file. Furthermore, when executing the DTP again, all the data is loaded again (and not only the delta).
    Any ideas?
    Regards,
    F C

  • Count of records during Open Hub???

    Hi all
    I am using an Open Hub of type File, and downloaded the Excel file onto my desktop. I got two files: one is the header file and the other is the data file.
    By any chance, can I display the number of records that were retrieved in the data file?

    Hi,
    You can see the number of records in the DTP monitor.
    Just as a quick reference when dealing with open hubs, you can refer the below article as well -
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00d05236-08ad-2d10-2297-aa7cec6766a3?QuickLink=index&overridelayout=true
    Hope this helps.
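    Besides the DTP monitor, a minimal sketch for counting the records directly in the downloaded data file on the PC (the file name is a hypothetical example):

    REPORT z_count_ohd_file_records.

    DATA: lt_lines TYPE TABLE OF string,
          lv_count TYPE i.

    " Load the open hub data file from the local PC
    cl_gui_frontend_services=>gui_upload(
      EXPORTING filename = 'C:\temp\ohd_data.csv'
      CHANGING  data_tab = lt_lines ).

    lv_count = lines( lt_lines ).
    WRITE: / 'Number of records in the data file:', lv_count.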
    Regards,
    RahulM

  • Open Hub Destination question

    Hi,
    We want to be able to extract data from a cube and would like to use Open Hub Destination for it. We also want to extract deltas.
    The question is ..
    Should we create a single InfoPackage for this with delta capability,
    or two separate InfoPackages: one for full mode and another for delta mode?
    Regards
    Vandana

    Hi,
    Yes, what you said is right.
    The cube from which you want to retract the data will be the source.
    First create the open hub in transaction RSBO.
    In the open hub destination, select it and create the transformation to the table (to where you want to export).
    Then create the DTP between the cube and the table.
    You can directly use a delta DTP: first all the records will come, and later only the delta records will be updated.
    Thanks & Regards,
    sathish

  • Open hub destination - additional property needed

    Hi,
    I have one open hub destination object which writes data from an InfoCube to an external file (on the application server), and the file is distributed to customers via email.
    This is absolutely working fine.
    Now, I also need to extract data from another datasource into the same cube (these are nothing but corrections to the original data existing in the cube), and the corrected data needs to be updated in the file on the application server.
    As overwriting is not possible in InfoCubes, I wonder how I could get the corrected data updated in the external files.

    Hi,
    Try to load a DELTA to the cube from the datasources. This way only the changes will be added to the already existing records.
    Otherwise, load the data from the cube to a new DSO and then load the data from the other datasources into the same DSO. The latest correct data from the other datasources will overwrite the existing data in the DSO.
    Use this new DSO in the Open Hub.
    Regards,
    Balaji V

  • Open Hub 3rd Party - Process Chain Automation

    Hi,
    I have a requirement to fulfil an Open Hub extraction to a 3rd-party SQL Server DB using process chain automation.
    At times there are no deltas, and during the execution of the DTP, since there are no deltas, the request status shows GREEN but the overall status shows YELLOW because there are no new records.
    Until we set this request to GREEN, we can't trigger the next load. I am doing these steps manually by going to tables/FMs to change the status.
    I am looking for options to automate this in the process chain; can someone help me with this request? Appreciate your help!
    Thanks,
    Pandu.

    Do you know when the delta will bring zero records? Also, when you say the overall status is yellow, which status are you talking about?
    You can always use an ABAP program in your process chain which checks whether the latest request ID has records and turns the status green, so that the dependent data triggers are sent.
    Thanks
    Abhishek Shanbhogue
