Open hub - output structure enhancement - will delta need to be disturbed?

Hi,
We have made some changes to the Open Hub output structure and added a few output fields; now we need to move it to production. I can see that in production it is delta-enabled, i.e. only newly added data is written to the output file.
Now that I need to move the new structure to production, I wanted to know: do I need to disturb the delta management for the Open Hub output, or will it be taken care of automatically?
Kindly advise.

Thanks,
Which is the best ABAP function to extract an SAP table to a flat file?
Because at this point, I have put the following in IF_EX_OPENHUB_TRANSFORM~TRANSFORM:
method IF_EX_OPENHUB_TRANSFORM~TRANSFORM.
  data: l_s_data_in  type /BIC/CYZEPC_OH_EPCOT_PROSPER,
        l_s_data_out type ZEPC_I_EPCOT_PROSPER.
  data l_dummy.

" Log the start of the exit in the open hub monitor.
  message i899(rsbo) with 'exit started' into l_dummy.
  i_r_log->add_sy_message( ).

" Map each input record to the output structure.
  clear e_t_data_out.
  loop at i_t_data_in into l_s_data_in.
    move: l_s_data_in-/BIC/ZEPCC0018 to l_s_data_out-MARQUEFIL,
          l_s_data_in-/BIC/ZEPCC0005 to l_s_data_out-GENRE,
          l_s_data_in-CALMONTH       to l_s_data_out-MOISFERME,
          l_s_data_in-/BIC/ZEPCC0008 to l_s_data_out-TYPPROSPER,
          l_s_data_in-CALYEAR        to l_s_data_out-ANNEE,
          'ABC'                      to l_s_data_out-COD_MVT.
    insert l_s_data_out into table e_t_data_out.
  endloop.

" Log the end of the exit.
  message i899(rsbo) with 'exit finished' into l_dummy.
  i_r_log->add_sy_message( ).
endmethod.
So now my destination DB table, /BIC/OHZEPC_OH_E (same structure as ZEPC_I_EPCOT_PROSPER above), should have the data inside.
Can I now just insert a function at the end to download the table to a flat file?
Thanks,
Tim
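
A minimal sketch of one common approach: read the generated table and write it to the application server with OPEN DATASET. The file path below is an assumption, and the open hub table may carry extra technical key fields, hence the INTO CORRESPONDING FIELDS:

data: lt_data type standard table of ZEPC_I_EPCOT_PROSPER,
      ls_data type ZEPC_I_EPCOT_PROSPER,
      lv_line type string,
      lv_file type string value '/tmp/zepc_oh_output.csv'. " assumed path

" Read the generated open hub table; CORRESPONDING FIELDS skips any
" technical key fields (request id etc.) not present in the structure.
select * from /BIC/OHZEPC_OH_E
  into corresponding fields of table lt_data.

" Write one comma-separated line per record to the application server.
open dataset lv_file for output in text mode encoding default.
loop at lt_data into ls_data.
  concatenate ls_data-MARQUEFIL ls_data-GENRE ls_data-MOISFERME
              ls_data-TYPPROSPER ls_data-ANNEE ls_data-COD_MVT
              into lv_line separated by ','.
  transfer lv_line to lv_file.
endloop.
close dataset lv_file.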

Similar Messages

  • I want higher level hierarchy values at open hub output

    Hi,
    InfoObject Profit Center has hierarchies.
    I am getting the leaf node values of Profit Center at the open hub output.
    Is there any way I can get the parent node values of Profit Center at the open hub output?
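
    A minimal sketch of one way to do this, assuming the usual BW naming where the 0PROFIT_CTR hierarchy sits in table /BI0/HPROFIT_CTR; the hierarchy ID and leaf value below are placeholders, and the field names should be verified in SE11 (0PROFIT_CTR is compounded with controlling area, so leaf NODENAME values may carry a 0CO_AREA prefix):

    data: lt_hier   type standard table of /BI0/HPROFIT_CTR,
          ls_leaf   type /BI0/HPROFIT_CTR,
          ls_parent type /BI0/HPROFIT_CTR.

    " Load the active version of one hierarchy into memory.
    select * from /BI0/HPROFIT_CTR into table lt_hier
      where hieid = 'HIE_ID'       " placeholder hierarchy id
        and objvers = 'A'.

    " Locate the leaf record for a given profit center value ...
    read table lt_hier into ls_leaf
      with key iobjnm = '0PROFIT_CTR' nodename = '0000001000'.
    if sy-subrc = 0.
      " ... and follow PARENTID one level up to the parent node.
      read table lt_hier into ls_parent
        with key nodeid = ls_leaf-parentid.
      write: / 'Parent node:', ls_parent-nodename.
    endif.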

    Hi Prashanth,
    yes, the best way for you is to use the DataSource 2LIS_11_V_SSL. Here you have the following status fields: LFGSA (overall delivery status of the item) and LFSTA (delivery status), and also quantities such as:
    WMENG     Requested Quantity of Sales Order Sched. Lines in Sales Unit
    BMENG     Confirmed Quantity of Sales Order Sched. Lines in Sales Unit
    VSMNG     Delivered Quantity in Sales Unit of Sales Order Sched. Lines
    Only if you also need the open quantity do you have to use a combination of two DataSources in one ODS object, for example 2LIS_11_V_SCL. There you will find the field OLFMNG (open quantity to be delivered).
    You have to consolidate both in one ODS object with the same key figures, and both should update different InfoObjects.
    I hope this helps.
    Best regards,
    Natalia

  • If I got my iPhone 4 from the Apple Store (UK), will it be open on all networks or will I need an unlock code from Apple?

    Will an iPhone 4 be open to all networks if purchased from the Apple Store (UK), or will I have to get a code from Apple or my current provider?

    If the Apple Store sells unlocked iphone then it will be unlocked.
    If they don't, then it won't.
    Ask them if they sell unlocked iphones.  If they do, then buy one.

  • How to create Fixed Length Flat File from Open Hub in BI 7.0

    My requirement is to produce a fixed-length flat file via an Open Hub destination. My Open Hub has four fields. Now the requirement is to create one extra field in the Open Hub which will contain the values of all four fields. In addition, the fields should be fixed length: if a field has no value and the field length/type is CHAR4, then 4 spaces should appear as the field value. So, basically, the Open Hub output will be a single field containing the information of four fields.
    How do I achieve this using the End Routine of the Transformation (from DSO to Open Hub)?

    Hi,
    You can map the four input fields to the new field in the Open Hub and change the rule type to "Routine".
    For example, if your source fields are called "first", "second", "third" and "forth", the ABAP routine could be similar to:
    DATA: l_t_1 TYPE c LENGTH 4,   " CHAR4 work fields: assignments are
          l_t_2 TYPE c LENGTH 4,   " automatically padded with trailing
          l_t_3 TYPE c LENGTH 4,   " blanks up to the fixed length
          l_t_4 TYPE c LENGTH 4.
    IF source_fields-first IS INITIAL.
      l_t_1 = '    '.              " four blanks when the field is empty
    ELSE.
      MOVE source_fields-first TO l_t_1.
    ENDIF.
    IF source_fields-second IS INITIAL.
      l_t_2 = '    '.
    ELSE.
      MOVE source_fields-second TO l_t_2.
    ENDIF.
    IF source_fields-third IS INITIAL.
      l_t_3 = '    '.
    ELSE.
      MOVE source_fields-third TO l_t_3.
    ENDIF.
    IF source_fields-forth IS INITIAL.
      l_t_4 = '    '.
    ELSE.
      MOVE source_fields-forth TO l_t_4.
    ENDIF.
    " RESPECTING BLANKS keeps the trailing spaces; without it CONCATENATE
    " drops them and the output is no longer fixed length.
    CONCATENATE l_t_1 l_t_2 l_t_3 l_t_4 INTO result RESPECTING BLANKS.
    In the example, the program uses four blank spaces if a field has no value.
    Additionally, if non-initial input values can be shorter than 4 characters (when the input fields have no fixed length), the CHAR4 work fields already pad them with trailing blanks on assignment, so the fixed length of 4 is preserved.
    I hope this helps you.
    Regards,
    Maximiliano
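
    A more compact variant of the same idea, assuming string templates are available (ABAP 7.02 or later); WIDTH pads each value with blanks to the fixed length of 4:

    result = |{ source_fields-first  WIDTH = 4 }| &&
             |{ source_fields-second WIDTH = 4 }| &&
             |{ source_fields-third  WIDTH = 4 }| &&
             |{ source_fields-forth  WIDTH = 4 }|.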

  • Open hub destination - additional property needed

    Hi,
    I have one Open Hub destination object which writes data from an InfoCube to an external file (on the application server), and the file is distributed to customers via email.
    This is working absolutely fine.
    Now I also need to extract data from another datasource into the same cube (these are nothing but corrections to the original data existing in the cube), and the corrected data needs to be updated in the file on the application server.
    As overwriting is not possible in InfoCubes, I wonder how I can get the corrected data updated in the external files.

    Hi,
    Try to load a DELTA to the cube from the datasources. This way only the changes will be added to the already existing records.
    Otherwise, load the data from the cube to a new DSO and then load the data from the other datasources to the same DSO. The latest correct data from the other datasources will overwrite the existing data in the DSO.
    Use this new DSO in the Open Hub.
    Regards,
    Balaji V

  • Open hub destination to download the output file into R/3

    Hi Guys,
    I have a requirement where I need to download the data from my BW cube to R/3. For this I am using an Open Hub destination.
    While creating the Open Hub destination we have an option called RFC Destination (there I selected my R/3 system as the destination). When I execute my DTP it says the load is successful and shows the number of records transferred.
    But I am not able to find where this data is transferred to, from the BW cube to the R/3 system, using the Open Hub.

    Hi Friends,
    I have one query on WDJ, with two requirements.
    When I click the "Export Excel" button, the data is downloaded into an Excel file. Now my requirement is that on clicking the Save button, the file is saved to the path /exchange/CED on the R/3 system (CED is the R/3 system ID; the file is saved under the system ID directory).
    Here the EP server and the R/3 server are different systems.
    I wrote the following code:
    InputStream text = null;
    int temp = 0;
    try {
      // Name the local file after the uploaded resource.
      File file = new File(wdContext.currentContextElement()
          .getResource().getResourceName().toString());
      wdComponentAPI.getMessageManager().reportSuccess("File::" + file);
      FileOutputStream op = new FileOutputStream(file);
      wdComponentAPI.getMessageManager().reportSuccess("op::" + op);
      // Copy the resource content into the output file byte by byte.
      if (wdContext.currentContextElement().getResource() != null) {
        text = wdContext.currentContextElement().getResource().read(false);
      }
      while ((temp = text.read()) != -1) {
        op.write(temp);
      }
      op.flush();
      op.close();
      String path = "/exchange/CED/" + file;
      wdComponentAPI.getMessageManager().reportSuccess("path:" + path);
    } catch (Exception ex) {
      wdComponentAPI.getMessageManager().reportSuccess(ex.getMessage());
    }
    My requirement is that the file is saved under this path (path = "/exchange/CED/" + file;).
    Regards
    Vijay Kalluri

  • Need info on OPEN HUB SERVICES

    Hi Experts,
    I have a requirement where I have to extract data from the standard Sales Overview InfoCube (0SD_C03) and store it on a file server. From the file server I need to push the file into the MATRIX database on a daily basis (delta load). The file needs to follow a certain format, with the fields based on a selection condition.
    Can anyone advise on how to go about this? I have not worked with Open Hub services. I would like to have screenshots and, if possible, a step-by-step procedure to follow.
    Thanks in advance

    Hi,
    Please go through these simple steps for the creation of an Open Hub destination:
    1. In the Modeling area, choose Open Hub Destination.
    2. In the context menu, choose Create Open Hub Destination.
    3. Enter the technical name.
    4. On the Destination tab, you can select the flat file format .csv (the only format supported).
    5. On the Field Definition tab, define the properties of the fields you want to transfer.
    6. Activate.
    If you want further details you can refer to
    http://help.sap.com/saphelp_nw70/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Regards,
    Md Zubair Sha

  • Need a code for open hub destination

    Hi Experts,
    I have a requirement on Open Hub destination, as we are working on FI datasources.
    When using an Open Hub destination, it creates two files: one is the structure (header) file and the other is the data file.
    Now my client is asking us to combine both files into a single file when using the Open Hub destination.
    Could you please explain how to achieve this, and share the code?
    Thanks in advance!
    Regards
    sinu reddy
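
    One way to do this, sketched under assumptions: a small ABAP step at the end of the process chain that writes the structure (S_) file and then the data file into one combined file on the application server. All three file names below are placeholders:

    data: lv_line   type string,
          lv_struct type string value '/interfaces/S_OUTPUT.CSV',  " assumed
          lv_data   type string value '/interfaces/OUTPUT.CSV',    " assumed
          lv_target type string value '/interfaces/COMBINED.CSV'.  " assumed

    open dataset lv_target for output in text mode encoding default.

    " First copy the structure (header) file ...
    open dataset lv_struct for input in text mode encoding default.
    do.
      read dataset lv_struct into lv_line.
      if sy-subrc <> 0. exit. endif.
      transfer lv_line to lv_target.
    enddo.
    close dataset lv_struct.

    " ... then append the data file.
    open dataset lv_data for input in text mode encoding default.
    do.
      read dataset lv_data into lv_line.
      if sy-subrc <> 0. exit. endif.
      transfer lv_line to lv_target.
    enddo.
    close dataset lv_data.
    close dataset lv_target.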


  • Open Hub, Delta Transformation

    Hi All,
    Thanks in advance for your help. I am extracting data from a DSO to a flat file using an Open Hub and a delta DTP. The problem I have is that if there is no delta (i.e. zero new/changed records) since the last time the Open Hub ran, my file is not cleared.
    For example, if I run the DTP in the morning and receive 100 records, the Open Hub creates a file in the UNIX directory with 100 records. However, if I run the DTP again in the evening and there are no new records, the file is not cleared and remains with the 100 records. This is causing a problem down the line because the data is sent twice.
    Is there any way the DTP can output blank files? Or can the file be cleared out in a different step in the process chain?
    Your help is very much appreciated.
    Thanks
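
    For the process-chain option, a minimal sketch: an ABAP program step scheduled before the DTP that removes the previous file, so a delta run with zero records leaves no stale file behind. The file path is an assumption:

    data lv_file type string value '/interfaces/delta_output.csv'. " assumed

    " Remove the file from the last run; sy-subrc <> 0 if it was absent.
    delete dataset lv_file.
    if sy-subrc <> 0.
      write: / 'No previous file to delete.'.
    endif.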

    Hi,
    This note should correct your problem: 1111259
    Summary
    Symptom
    You extract data using a Delta data transfer process (DTP) to an open hub destination of type 'File on application server'. If the source does not contain any new data, the existing file is not deleted.
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 7.0 BI
               Import Support Package 17 for SAP NetWeaver 7.0 BI (BI Patch 17 or SAPKW70017) into your BI system. The Support Package is available when Note 1106569 "SAPBINews BI7.0 Support Package 17", which describes this Support Package in more detail, is released for customers.
    In urgent cases, you can implement the correction instructions as an advance correction.
    Hope this helps.
    Regards,
    Diego

  • Open Hub Destination Delta issue

    Hi experts,
    We have a scenario in which an Open Hub is created on top of a cube.
    Regular deltas move from this cube to the Open Hub daily.
    But as per a request, we had to dump the cube data, and the reload took 2 days to complete.
    So those 2 days of data were missed in the Open Hub.
    Now, as the new requests are not datamarted, the next time the DTP between the cube and the Open Hub runs, it will fetch all the requests, right? We don't want that.
    So, can we run an init without data transfer so that the source status is fetched?
    But then how do we load those 2 days of data alone?
    Kindly advise.

    If the cube doesn't have any other targets apart from this OHD:
    delete the two days' requests in the cube, set the init from the cube to the OHD,
    load the cube delta (which brings in the two days' requests) and run the OHD.

  • Open Hub Destination table structure update

    Hi,
    I created an Open Hub destination; activation created the table, the data was populated in the table, and I have a view over the table.
    I need to add a few more fields to the Open Hub. When I did that and activated again, it was supposed to change the table structure, adding the new fields that were added in the Open Hub.
    The activation was successful, but the table structure still stays the same. I didn't see the new fields in the generated table.
    Is anyone having similar issues?
    Thanks in advance

    We deleted all the related objects from staging and production and re-transported them from development as though they were being sent for the first time, and that works. Though this approach is not good, we had to use it, as I didn't find any other solution.

  • Open Hub Destination with Flat File - Structure file is not correct

    Using the new concept for Open Hub destination (NOT an InfoSpoke).
    Output is to a Flat File. Remember that when output is to a flat file 2 files are generated: the output file and a structure file (S_<name of output file>.CSV)
    The Output file structure corresponds to the structure defined in the Open Hub destination.
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    I have not found any settings that I can modify for the S_ file. When using InfoSpokes the S_ file is correct and I do not have to do anything.
    The issue is that the S_ file cannot be used to describe the output file. This is a problem.
    Anyone knows a solution to this issue? The goal is to get the S_ file to reflect the proper structure.
    I could not find any Notes on this. Please do not send general help links, this is a specific question.
    Thank you all for your support.

    Hi,
    However, the S_ (structure) file does not reflect the proper structure. The S_ file lists all fields sorted in alphabetical order based on the field name, instead of listing all fields in the correct order that they appear in the output file.
    As far as I know, and as I have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (and, on the transformation tab, the source structure).
    I would suggest you check it again.
    Derya

  • Open Hub Delta Records

    Hi,
    Can we download delta records from an ODS / cube using Open Hub? Can someone please tell me?
    Thanks,
    GAL

    Hi,
    Yes, it is possible:
    1) Create an Open Hub on top of the cube/DSO, create a delta DTP and execute it.
    2) Select the destination type as CSV file or Database table.
    If you select CSV file as the destination type:
    1) If you want to download it to your desktop or local machine, click on the directory and choose the path.
    2) If you want to download it to the application directory, check the Application Server button; you will see the directory path automatically, and the file will be placed in that particular directory.
    If you select Database table as the destination type:
    1) You will see the database table name on the screen; go to SE11 and download the data from that particular table.

  • Open hub delete delta

    Hello,
    I'm using Open Hub in BI 7.0. How does the delta queue work for this tool?
    I'm extracting data from a cube to a flat file with delta. I'm storing the flat file on the SAP server for further processing.
    How do I handle re-runs of the delta? For example, if something goes wrong in the load, how do I re-run the last delta without having to reload the entire content of the source cube?
    Thanks,
    F C

    Hello again,
    But I cannot delete the request from the cube. This is the source of the load. I do not wish to delete the source data, but only the "delta queue", in order to be able to reload. Please elaborate further.
    I have noticed that setting the monitor status to red deletes the target file. Furthermore, when executing the DTP again, all data is loaded again (not only the delta).
    Any ideas?
    Regards,
    F C

  • DTP Delta request for open hub destination.

    Hi expert,
    As you know, in BW 7.0 the Open Hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases DTP delta is enabled.
    My questions are:
    1) In the context menu of the Open Hub destination, I did not find anything like Manage, which would list all the DTP requests executed for this Open Hub destination; but for a normal InfoCube or ODS, you can find such a list.
    2) If the DTP has delta enabled, where can I delete the latest delta request and run a new load?
    3) I assume the DTP can be executed in batch, integrated with a process chain via the process type 'DTP', right? If the destination type is a third-party tool, can it also be integrated with a process chain?
    4) Can you give some of the obvious differences between an InfoSpoke and an Open Hub destination, in terms of advantages and disadvantages?
    Thanks a lot for the feedback.
    Best Regards,
    Bin

    Hi,
    Please look at the links below; I hope they help.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh
