To create a header for an Open Hub extract

Hi guys,
How do I get a header for an open hub extract to a CSV file, a header consisting of the field names (technical or otherwise)?
Thanks,
Your help will be greatly appreciated

hi,
Field definition
On the Field Definition tab page you define the properties of the fields that you want to transfer.
We recommend that you use a template as a basis when you create the open hub destination. The template should be the object from which you want to update the data. This ensures that all the fields of the template are available as fields for the open hub destination. You can edit the field list by removing or adding fields. You can also change the properties of these fields.
You have the following options for adding new fields:
●     You enter field names and field properties, independent of a template.
●     You select an InfoObject from the Template InfoObject column. The properties of the InfoObject are transferred into the rows.
●     You choose Select Template Fields. A list is displayed of the fields that are available for the open hub destination but are not contained in the current field list. You transfer a field to the field list by double-clicking it. This allows you to transfer fields that had been deleted back into the field list.
If you want to define the properties of a field so that they are different from the properties of the template InfoObject, delete the template InfoObject entries for the corresponding field and change the properties of the field. If there is a reference to a template InfoObject, the field properties are always transferred from this InfoObject.
The file or database table that is generated from the open hub destination is made up of the fields and their properties and not the template InfoObjects of the fields.
If the template for the open hub destination is a DataSource, field SOURSYSTEM is automatically added to the field list with reference to InfoObject 0SOURSYSTEM. This field is required if data from heterogeneous source systems is being written to the same database table. The data transfer process inserts the source system ID that is relevant for the connected DataSource. You can delete this field if it is not needed.
If you have selected Database Table as the destination and Semantic Key as the property, the field list gets an additional column in which you can define the key fields for the semantic key.
In the Format column, you can specify whether you want to transfer the data in the internal or external format. For example, if you choose External Format here, leading zeros will be removed from a field that has an ALPHA conversion routine when the data is written to the file or database table.
Assign points if helpful
Cheers!!
Aparna

Similar Messages

  • DTP Delta request for open hub destination.

    Hi expert,
As you know, in BW 7.0 the open hub destination is used for the open hub service, and you can create a transformation/DTP for the destination; in some cases delta DTP is enabled.
My questions are:
1) In the context menu of the open hub destination, I did not find anything like Manage, which would list all the DTP requests executed for this open hub destination, but for a normal InfoCube or ODS you can find such a list.
2) If the DTP has delta enabled, where can I delete the latest delta request and start a new load?
3) I assume the DTP can be executed in batch, integrated with a process chain via the process type 'DTP', right? If the destination type is a third-party tool, can it also be integrated with a process chain?
4) Can you give some of the main differences between an InfoSpoke and an open hub destination in terms of advantages and disadvantages?
    Thanks a lot for the feedback.
    Best Regards,
    Bin

    Hi ,
Please look at the links below. Hope they help.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/d063282b-937d-2b10-d9ae-9f93a7932403?QuickLink=index&overridelayout=true
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e830a690-0201-0010-ac86-9689620a8bc9?QuickLink=index&overridelayout=true
    Regards
    Santosh

  • Modifying destination Directory of Application server for Open Hub

    Hi All,
I want to load a CSV file to the application server using Open Hub.
When I created the open hub, I specified the server name, the file name and the directory.
But the Basis guys told me that I should use another directory; they created a Unix directory for me, but the problem is that I can't modify the directory path in my open hub.
Can you tell me please what the problem is?
Another question: why can't I modify this parameter (server name, directory) of the open hub directly in the production system?
In production I can create an open hub, but when one has been transported from dev to prod I can't modify it in prod. I checked the object changeability in the transport tool, and for open hub it is set to "original modifiable".
    Thanks for your help
    Bilal

I have read in some forum that to change the server name or logical file name we can do so in the table RSBFILE.
I tried to view the data of this table and found 2 records for my open hub, one for the active version and one for the modified version.
I tried to change the directory path name there, and after saving, the system gave me the message that the active version and the modified version of my open hub are not equal; when I try to activate my open hub again, the system takes the old directory path.
Is this the right table for changing the directory path on the application server for my open hub, or is there another table?
I don't work with a logical file name; I work with a file name and a directory path name.
    thanks for your help
    Bilal

  • Transformations on Datasource for Open Hub Services

    Dear all,
I have created a generic DataSource on a transparent table, and then created an open hub destination on this DataSource. Now I am unable to create the transformation on it; I encountered the error "Cannot connect DataSource to an open hub destination".
Please let me know whether it is possible to create transformations for open hubs on a DataSource.
If not, let me know a procedure to populate the data from the generic DataSource.
    Kind Regards,
    Sunitha

    Hi,
A connection with a DataSource is not possible. But try to connect the DataSource to an InfoSource and use the InfoSource as the source for the open hub; maybe this works.
    cheers
    Juergen

  • Creating a header for a .CSV file

    Hi,
    I have looked through the forums and cannot find a solution for creating a header for a csv file.  I am using Labview 8.2.  I want a label for each column at the top of the file and then I will append new rows to the file as the data is collected.  An example of what I am looking for is attached.
    It would also be nice to have the labels descending in the first column, pretty much a transposed version of what was described above. 
    Thanks,
    Gary 
    Attachments:
exampleLOG.csv (1 KB)

    Thank you very much.  That worked well. 
If I wanted to transpose the data, how would I do that? I can get the header to be vertical, but I can't get the data to append to the 2nd column, and then the third and so on, with the data descending from top to bottom. I attached an example of what I might want the file to look like. Each column would be added one at a time.
    Attachments:
data3.csv (1 KB)

  • Multiprovider as Datasource for Open Hub Destination

    Hello,
Can you please let me know if we can use a MultiProvider as the data source for an Open Hub destination. We are currently on BI 7.0, SP9. I am not able to see the option of MultiProvider in the template.
    Appreciate the input ASAP.
    Thanks
    Gopal

A MultiProvider is not yet supported for the Open Hub destination, and hence you don't see that option.
As of SAP NetWeaver 2004s SPS 6, the open hub destination has its own maintenance interface and can be connected to the data transfer process as an independent object. As a result, all data transfer process services can be used for the open hub destination. You can now select an open hub destination as a target in a data transfer process. In this way, the data is transformed in the same way as for all other BI objects.
    In addition to the InfoCube, InfoObject, and DataStore object, you can also use DataSources and InfoSources as templates for the field definitions of the open hub destination.
    The open hub destination now has its own tree under Modeling in the Data Warehousing Workbench. This tree is structured by InfoAreas.
    <b>Restrictions -</b>
    It is not currently possible to use MultiProviders as the source for the open hub destination.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/7c836c907c7103e10000000a1553f7/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Open Hub extraction for Hierarchies

    Hello,
I need to create an Open Hub on 0MAST_CCTR in HR-PA. Unfortunately, the Open Hub does not give you the option for hierarchies. Any ideas?
    Thanks,

    Hi,
If it is a flat-file output, then you can use the program mentioned in the SAP How-To document below. We are using this in our system.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/0403a990-0201-0010-38b3-e1fc442848cb?quicklink=index&overridelayout=true
    Regards,
    Raghavendra.

  • Dynamic File name for Open Hub file

    Hi All,
I want to create a .csv file using an open hub destination. I want the name of the file to be dynamic based on the month, for example 01.Dump for January and 02.Dump for February.
Is it possible at all to do that? I want to save the file on the application server, and there should not be any manual intervention in the process.
    Thanks in advance.
    Regds
    Raghu

    Hi
We take the month from the first record that we read.
Our requirement is to take one month's data and write it to a file whose name contains the corresponding month.
In the code below, MOC_CODE is the field that is used for this purpose.
    Hope it helps.
    Regards,
    Raghu
code (reconstructed - the variable names inside the original post's ${...} references were stripped by the forum formatting, so the names below are inferred from context):
#!/bin/sh
# Script to dump file.
# Parameters:
#   $1  Division name to extract from the data file.
# It re-arranges the columns.
# It replaces values in strings.

# Parameters.
DATA_FILE_NAME="data_file.csv"
HEADER_FILE_NAME="header_file.csv"
DIVISION_NAME="$1"

# Re-arrange header file columns:
# transpose rows to one comma-separated row.
HEADER_ROW=`cat "$HEADER_FILE_NAME" | tr '\n' ',' | sed 's/,$//g'`
# Re-arrange header columns and re-direct the output to a temporary file.
echo "$HEADER_ROW" | awk 'BEGIN { FS = ","; OFS = "," } { print $2, $3, $1, $4, $5, $6, $7 }' > "$HEADER_FILE_NAME.$DIVISION_NAME.tmp"

# Prepare awk program:
# re-arrange data file columns,
# replace UNI in the 6th column with UNIT and TON with TONS,
# filter the division rows.
AWK_PROGRAM="BEGIN { FS = \",\"; OFS = \",\"; TVAL = 0 } \$5 ~ /$DIVISION_NAME/ { sub( \"UNI\", \"UNIT\", \$6 ); sub( \"TON\", \"TONS\", \$6 ); print \$2, \$3, \$1, \$4, \$5, \$6, \$7; TVAL = TVAL + \$7 } END { printf \"$DIVISION_NAME Total Value: %f\", TVAL > \"Control.ctl\" }"
# Re-direct the program to a temporary program file.
echo "$AWK_PROGRAM" > "$DIVISION_NAME.awk"

# Execute the data file formatting command.
awk -f "$DIVISION_NAME.awk" "$DATA_FILE_NAME" > "$DATA_FILE_NAME.$DIVISION_NAME.tmp"

# Get generation date.
GENERATION_TIME=`date +"%Y%m%d_%H%M%S"`
# Get MOC code from the data file.
MOC_CODE=`head -1 "$DATA_FILE_NAME" | awk -F, '{ print $3 }'`

# Prepare dump file (the exact name components were lost in the original post).
DUMP_FILE_NAME="${DIVISION_NAME}_${MOC_CODE}_${GENERATION_TIME}.csv"
cat "$HEADER_FILE_NAME.$DIVISION_NAME.tmp" "$DATA_FILE_NAME.$DIVISION_NAME.tmp" > "$DUMP_FILE_NAME"

# Remove temporary files.
rm *.tmp

# FTP dump file to remote server.
REMOTE_SERVER="<replace with server ip address>"
REMOTE_USER="xxxx"
REMOTE_PWD="xxxx"
ftp -n "$REMOTE_SERVER" << EOF
     quote USER "$REMOTE_USER"
     quote PASS "$REMOTE_PWD"
     ascii
     put "$DUMP_FILE_NAME"
     bye
EOF

# Exit.
exit 0

  • Inconsistency with ZPMSENDSTATUS for Open Hub Destination

    Hi Experts,
I don't know if you have come across this scenario, but I will appreciate all your advice.
We have APO 7.0 and we are using the BW component to extract data to Informatica.
    Our environment is as follow
    SAP SCM 7.0 SP8
    Informatica PWCD 8.1.6 hot fix 13
1. We have created 5 chains with similar variants for ZPMSENDSTATUS.
2. Each variant relates to a folder and workflow to be executed within Informatica.
3. When we execute the chains, only two work correctly all the time and 3 fail.
The 3 that are failing fail within ZPMSENDSTATUS when running the API RSB_API_OHS_DEST_SETPARAMS. This is odd since the other two only differ in the workflow to be executed and, obviously, in the Open Hub destination. Can anyone provide some advice on this?
    thanks,
    Nat

    Hi Maruthi,
Thanks for your response, but here are the details for the variants that are working and failing:
    working:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP2_BGA:S_M_SAP_APO_PART_PLATFORM_ABP2
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
    Failing:
    DEST: < here we have the RFC that relates to the logical system and the program id which is also included in saprfc.ini>
    INFPARA: AERO/APO:WF_SAP_APO_PART_PLATFORM_ABP4_ATR:S_M_SAP_APO_PART_PLATFORM_ABP4
    CONTEXT: OHS API
    OHDEST: <OHDS defined as target>
So you see, they all target the same folder, only with different workflows. Still, it works for the first and not for the second.

  • Sample Code in BADI for Open Hub Services

    Hi all,
We have a requirement to extend the target structure of an InfoSpoke in Open Hub services (add a few new fields and populate them using a BADI).
I have 2 issues which are stopping us from proceeding further.
1. I did this by just adding new fields in change mode of the target structure (I didn't use 'Append structure' because, while saving and creating the transport request, it raises an error like 'Structure not present in TRDIR'). I am not able to assign this to any transport request; for now I have saved it as a local object. How can I assign this to a request? Is it possible?
2. After putting a breakpoint in the BADI I don't see any data in the importing table. I need some sample code for the BADI that populates the new fields in the target structure.
    It will be great if anyone of you will give us any solution for the same.
    Thanks,
    Rahul.
    Edited by: Rahul Siddi on Oct 12, 2009 3:04 PM

    Hello Rahul,
    Find the code below with the steps to be implemented.
Enter your InfoSpoke in edit mode.
- On the Transformation tab, set the indicator Transformation with BADI and activate the InfoSpoke.
- This will take you to the Add-In implementation / BADI builder.
- Enter the short text/description for the implementation. The implementation name is always the same as the technical name of the InfoSpoke.
- The implementation of the BADI is always filter dependent.
- In the properties tab of the InfoSpoke, enter your InfoSpoke under the Filter specifications.
If you do not specify an InfoSpoke under Filter Specifications, then this implementation is valid for all InfoSpokes. This means that it is called for all InfoSpokes during the extraction.
- Activate your class.
- From the interface tab page, double-click the TRANSFORM method and you will arrive in the class builder.
- Here you can enter the code.
- To do a lookup of the master data you have to write code similar to what I've given below. This is just an example of looking up the material master.
IF flt_val = 'Your infospoke'.
  t_data_in[] = i_t_data_in[].
  SELECT material zstd_cost FROM /bi0/pmaterial INTO TABLE t_return
    FOR ALL ENTRIES IN t_data_in
    WHERE material = t_data_in-material.
* ... continue with your code:
* append the output built from t_return to the output table e_t_data_out.
ENDIF.
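To actually fill the new fields in the target structure (the second issue in the question), the lookup is then followed, inside the same InfoSpoke check, by something along these lines. This is only a rough sketch; MATERIAL and ZSTD_COST are just the illustrative names used in the snippet above, and T_RETURN is assumed to hold both of them.
DATA: ls_in  LIKE LINE OF i_t_data_in,
      ls_out LIKE LINE OF e_t_data_out,
      ls_ret LIKE LINE OF t_return.
* Copy each incoming record, fill the new field from the lookup result,
* and append the record to the outgoing table (this belongs before the ENDIF above).
LOOP AT t_data_in INTO ls_in.
  MOVE-CORRESPONDING ls_in TO ls_out.
  READ TABLE t_return INTO ls_ret WITH KEY material = ls_in-material.
  IF sy-subrc = 0.
    ls_out-zstd_cost = ls_ret-zstd_cost.
  ENDIF.
  APPEND ls_out TO e_t_data_out.
ENDLOOP.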
    - Activate your method. Return to the BAdI builder. Return to your InfoSpoke.
    Check if you missed any of these...
    Kris...

  • Problem with a Open Hub Extraction

    Hi Experts,
    I have a huge problem here.
I'm trying to extract some data with an Open Hub application, but I'm having trouble with the application server.
See, I have to extract the information from a DSO to a file on an application server, but the company has one application server for each environment (BD1, BQ1, etc.), the conversion is not working, and the transport of the request always fails.
    Anyone knows how to fix this??
    Thanks.
    Regards.
    Eduardo C. M. Gonçalves

    Hi,
While creating the open hub, you need to maintain the file name details under transaction FILE.
I hope you have maintained that.
You will also have to use a variable in the file path so that the path changes appropriately in each system.
There you will have to define:
Logical File Path Definition
Assignment of Physical Paths to Logical Path
Logical File Name Definition, Cross-Client
Once you have defined these and transported them, your Open Hub will go smoothly.
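For example (all names here are purely illustrative), in transaction FILE you could define:
Logical file path Z_OPENHUB_PATH with the physical path /interface/<SYSID>/openhub/<FILENAME>
Logical file name Z_OPENHUB_FILE, using physical file OHD_SALES.CSV and logical path Z_OPENHUB_PATH
Because <SYSID> is resolved at runtime, the same open hub definition then writes to the correct directory in each system (DEV, QA, PROD) without any manual change after the transport.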
    Thanks
    Mayank

  • How to create a report for open sales orde documents which are not invoiced

Hi Experts, this is urgent.
Please give the logic for the document flow.
My requirement is to create a report for sales orders which are not invoiced, using the following tables:
VBAK: sales order header
VBAP: sales order item
VBFA: sales document flow
VBUK: processing status
KOMV: duties value and sales order value
LIKP: delivery note header
LIPS: delivery note item
For information: at header level, the processing status for a sales order number is indicated in table VBUK, field LFSTK. A, B and C are the possible entries.
Case A: When a sales order is invoiced, the header status shows Overall status: Completed, and an invoice number is displayed in the document flow. When the items of the sales order are invoiced, the processing status is:
Overall status: Completed
Delivery status: Fully delivered
Case B: An open sales order that is not delivered and not invoiced will have overall status Open at header and item level and will not have subsequent documents.
Case C: When the items of the sales order are delivered but not invoiced, the status will be "Fully delivered", and the subsequent documents will be delivery notes and goods issues if the delivery note has been issued.
    With regards
    ravi
    Edited by: ravik ravik on Jun 25, 2008 3:29 PM

    Hello Ravi,
You need not develop any report:
there is a standard report, transaction V.02,
or copy it and make the necessary changes.
    Reward, if helpful.
    Rgds,
    Raghu.
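If a custom report is still needed, a rough sketch of the core selection is below. It is only a sketch, not the full requirement: it assumes the billing status is read from VBUK-FKSTK (where 'A' = not yet processed and 'B' = partially processed) and it leaves out the VBFA/KOMV details.
REPORT z_open_orders_not_invoiced.

TYPES: BEGIN OF ty_order,
         vbeln TYPE vbak-vbeln,   " sales order number
         erdat TYPE vbak-erdat,   " creation date
         lfstk TYPE vbuk-lfstk,   " delivery status
         fkstk TYPE vbuk-fkstk,   " billing status
       END OF ty_order.

DATA: lt_orders TYPE STANDARD TABLE OF ty_order,
      ls_order  TYPE ty_order.

* Sales orders whose billing status is not yet completed,
* i.e. not (fully) invoiced.
SELECT k~vbeln k~erdat u~lfstk u~fkstk
  INTO TABLE lt_orders
  FROM vbak AS k
  INNER JOIN vbuk AS u ON u~vbeln = k~vbeln
  WHERE u~fkstk IN ('A', 'B').

LOOP AT lt_orders INTO ls_order.
  WRITE: / ls_order-vbeln, ls_order-erdat, ls_order-lfstk, ls_order-fkstk.
ENDLOOP.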

  • How to create complecated Header for JTable in a JApplet ?

I have a JTable in a JApplet.
I need to create a complicated header for the table. For that I have used downloaded code from the URL http://www2.gol.com/users/tame/swing/examples/JTableExamples1.html (first example).
Instead of "GroupableHeaderExample.java", I have used my own code, "StatusReportPopupMenu.java".
The program runs properly in appletviewer. But when I try to open it in IE 5.5, it first shows the message "Loading Java Applet", then it goes blank. The status bar shows the message "Start: applet not initialized". In the Java console of the browser I can see the exception given below:
    java.lang.NullPointerException
         at GroupableTableHeaderUI.getHeaderHeight(GroupableTableHeaderUI.java:97)
         at GroupableTableHeaderUI.createHeaderSize(GroupableTableHeaderUI.java:118)
         at GroupableTableHeaderUI.getPreferredSize(GroupableTableHeaderUI.java:128)
         at javax.swing.JComponent.getPreferredSize(Unknown Source)
         at javax.swing.ViewportLayout.preferredLayoutSize(Unknown Source)
         at java.awt.Container.preferredSize(Unknown Source)
         at java.awt.Container.getPreferredSize(Unknown Source)
         at javax.swing.JComponent.getPreferredSize(Unknown Source)
         at javax.swing.ScrollPaneLayout.layoutContainer(Unknown Source)
         at java.awt.Container.layout(Unknown Source)
         at java.awt.Container.doLayout(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validateTree(Unknown Source)
         at java.awt.Container.validate(Unknown Source)
         at sun.plugin.AppletViewer$AppletEventListener.appletStateChanged(Unknown Source)
         at sun.applet.AppletPanel.dispatchAppletEvent(Unknown Source)
         at sun.applet.AppletPanel.appletResize(Unknown Source)
         at java.applet.Applet.resize(Unknown Source)
         at java.applet.Applet.resize(Unknown Source)
         at java.awt.Component.setSize(Unknown Source)
         at sun.plugin.AppletViewer$AppletEventListener.appletStateChanged(Unknown Source)
         at sun.applet.AppletPanel.dispatchAppletEvent(Unknown Source)
         at sun.applet.AppletPanel.appletResize(Unknown Source)
         at java.applet.Applet.resize(Unknown Source)
         at java.awt.Component.setSize(Unknown Source)
         at StatusReportPopupMenu.init(StatusReportPopupMenu.java:245)
         at sun.applet.AppletPanel.run(Unknown Source)
         at java.lang.Thread.run(Unknown Source)
I'm not able to figure out what I should do to correct the problem.
    Please help. This is very urgent.

    Hi,
1. For this we have to use the authorisation object P_ORGIN.
    2. It has got the following fields, on which authorisations can be controlled.
    AUTHC Authorization level
    INFTY Infotype
    PERSA Personnel Area
    PERSG Employee Group
    PERSK Employee Subgroup
    SUBTY Subtype
    VDSK1 Organizational Key
    Regards,
    Harish

  • Need a code for open hub destination

    Hi Experts,
I have a requirement on the open hub destination; we are working on FI DataSources.
When using the open hub destination, it creates two files: one is the data file and the other is the structure (header) file.
Now my client is asking us to combine both of these files into a single file when using the open hub destination.
Could you please explain how to achieve this?
Please share the code.
    Thanks in advance !!!!
    Regards
    sinu reddy

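One option is a small follow-on step after the DTP. Below is only a rough sketch in ABAP, not standard open hub functionality: it assumes the open hub writes a data file and a structure file S_<name>.CSV into the same application server directory, that the field name is the first comma-separated column of the structure file, and that all file names used here are illustrative placeholders.
REPORT z_combine_ohd_files.

DATA: lv_dir    TYPE string VALUE '/interface/openhub/',
      lv_struct TYPE string,
      lv_data   TYPE string,
      lv_target TYPE string,
      lv_line   TYPE string,
      lv_field  TYPE string,
      lv_header TYPE string.

CONCATENATE lv_dir 'S_OHD_DATA.CSV'      INTO lv_struct.
CONCATENATE lv_dir 'OHD_DATA.CSV'        INTO lv_data.
CONCATENATE lv_dir 'OHD_WITH_HEADER.CSV' INTO lv_target.

* 1. Build the header line from the structure file
*    (assumption: the field name is the first comma-separated column).
OPEN DATASET lv_struct FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET lv_struct INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  SPLIT lv_line AT ',' INTO lv_field lv_line.
  IF lv_header IS INITIAL.
    lv_header = lv_field.
  ELSE.
    CONCATENATE lv_header lv_field INTO lv_header SEPARATED BY ','.
  ENDIF.
ENDDO.
CLOSE DATASET lv_struct.

* 2. Write the header line followed by all records of the data file.
OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
TRANSFER lv_header TO lv_target.

OPEN DATASET lv_data FOR INPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  READ DATASET lv_data INTO lv_line.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  TRANSFER lv_line TO lv_target.
ENDDO.
CLOSE DATASET lv_data.
CLOSE DATASET lv_target.

Such a program could be scheduled as an ABAP process in the same process chain, right after the DTP that fills the open hub destination.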

  • Open Hub Extraction

    Hello
I am doing an open hub and sending data to a file. Now, in the file, I want to maintain a header line (description of the fields) for the fields I am sending to the file. How can I do that? Can someone please explain it briefly?
    thanks

    Hi,
Once you run the InfoSpoke, it generates two files: a data file and a schema file. The data file contains the data, and the schema file contains the header, i.e. the structure of the InfoSpoke. So if you want to maintain the header in the data file, just copy the list of InfoObject names from the schema file and paste it as the first row of the data file.
    Thanks & Regards
    Ramakrishna Kamurthy
