Open Hub and Informatica

Hello All,
We are facing a situation where we need to extract data from SAP BW and load it into Informatica.
We have created an Open Hub destination.
The data volumes in the targets are large. We are able to load the data into the Open Hub quickly (6 million records in 15 minutes), but the transfer into Informatica is extremely slow (30-45 rows/sec). When we ran a trace we couldn't find any reason why the extraction is so slow.
Please let me know what checks I can do to find the bottleneck and what kind of performance improvements I should consider.
Regards,
Ravi

Hi Kumar,
Informatica has a connector for SAP BW that integrates as a third-party RFC destination via Open Hub.
In the process chain that loads the Open Hub you also need to include a custom ABAP program (distributed on the Informatica DVDs), which mainly handles event logging and informs the listener service on the PowerCenter side when the load to the Open Hub destination table is finished.
We will be using the same method, although we're just in blueprinting at the moment and haven't done a POC yet. I've got a document; contact me for more info.
Regards
Steffen

Similar Messages

  • Deleting DTP based on Open Hub and Remote Cube

    Hi All,
I have created a delta DTP for my Open Hub to load a flat file from a remote cube.
I ran some test load requests but it didn't work (no data in my .csv file). I then created another full DTP with different parameters and it works successfully.
Now I want to delete my delta DTP, but the system returns a message like this:
    DTP_4IDT4BXV29E1N0MWINKCWO60B cannot be deleted at the moment (see long text)
    and in the long text i have:
    Message no. RSBK037
    Diagnosis
    You want to delete a delta DTP that has been successfully used to load requests from the source into the target. If you delete the DTP, you will also delete the information on the source data that was successfully transferred. As a result the source data would be transferred again, if you create a new delta DTP that links the same source with the same target.
    System Response
    The system terminated the delete operation.
    Procedure
    You have two options:
    1. Delete all the requests loaded with this DTP from the target and then delete the DTP.
    2. Do not delete the DTP and continue to load deltas using this DTP.
I tried to see the requests loaded with the delta DTP and deleted one, but there are other requests that I can't delete.
I have also deleted the delta DTP from my transport request.
What should I do to delete my delta DTP permanently?
    thanks for your help
    Bilal

    Do not delete entries out of table RSBKREQUEST.
    To delete your DTP, you may use program RSBKDTPDELETE.  Give the technical id of the DTP as the input parameter.
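For reference, the call could be scripted roughly as follows. This is a sketch only: the selection-screen parameter name (P_DTP) and the id length are assumptions; check program RSBKDTPDELETE in SE38 for your release before relying on it.

```abap
* Sketch only: wraps SAP's standard delete program RSBKDTPDELETE.
* The parameter name P_DTP and the id type are assumptions - verify
* them in SE38 for your release.
REPORT z_delete_dtp.

PARAMETERS p_dtpid TYPE c LENGTH 30
  DEFAULT 'DTP_4IDT4BXV29E1N0MWINKCWO60B'.

* Hand the technical DTP id over to the standard delete program.
SUBMIT rsbkdtpdelete
  WITH p_dtp = p_dtpid
  AND RETURN.
```

Remember the warning above: this permanently deletes the delta bookkeeping for that DTP, so only run it once the target requests have been dealt with.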

  • Open Hub and Reporting

In Open Hub we send the data from a data target to an Excel sheet, and in reporting we also generate the report in an Excel sheet. What is the difference between the two?
    Regards,
    Chandu.

    Hi Eswar,
    Welcome to SDN !!!
It is the open hub service that enables you to distribute data from an SAP BW system into external data marts, analytical applications, and other applications. With it, you can ensure controlled distribution across several systems. The central object for the export of data is the InfoSpoke. In contrast, when you execute a report you get the output in Excel, but you cannot make any changes to the data; it can only be used for analysis.
    Hope it helps...
    Best Regards,
    DMK
    *Assign points if it helps...

  • Open Hub and Infospoke

    Hi,
Is there any difference between Open Hub and InfoSpoke?
Thanks

    Hi,
Check out the links below:
    infospoke and open hub
    What is Infospoke & Open hub service
    Regards
    Sajeed

  • Open Hub and Info Spoke

    Hi all,
We have a bunch of InfoSpokes created in our production system and we want to delete them based on their last-used date.
Can anyone tell me in which BW table the InfoSpokes are stored?
Also, just as we can find the last-used date for aggregates, is there any table where we can see when an InfoSpoke was last used? I know we can check this in the monitor, but when we are talking about hundreds of InfoSpokes it can be very tedious to check each one from the monitor.
Would really appreciate your answers.
    Surya

Hi Surya,
As far as I know, usage is not recorded in the database. You may possibly find some info in the InfoCubes of the technical content, but your question gives me the feeling that you have a modelling problem: normally, if your InfoSpokes run in process chains, you can track their usage from there. Also, with so many InfoSpokes, could it be that most of them are similar?
Anyway, if you use a file as the target, you can see its last creation date. I hope you can follow the relation between an InfoSpoke and its target.
    Cheers
    Paul

  • Open hub and Infospokes

    Hi ,
Can anyone explain the selection parameters of an InfoSpoke in a production system?
Is there any FM (function module) developed by SAP for this?
Please let me know how I can proceed.
    Maid

    Hi
You can do that in ABAP code, but I am not sure if there is any FM related to it.
    Step1
    Create a TVARV variable similar to the selection parameter of your Info Spoke.
    Step2
Code an ABAP program to pick up the value of the TVARV variable and change the value of the parameter in table RSBSPOKESELSET. This is the table that stores the selection set used to restrict your results.
    Step3
    Once you execute the program, the selection parameters would be changed.
    Step4
    Now, you execute your Info Spoke. It would have the new selections maintained by you in the TVARV variable.
    Try this and do let me know
    Santosh
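The steps above could be sketched roughly as follows. This is a sketch under loud assumptions: the TVARV variable name, the InfoSpoke name, and in particular the field names of RSBSPOKESELSET are illustrative guesses, not taken from the actual table definition; check the structure in SE11 before using anything like this.

```abap
* Sketch only: read a TVARV(C) value and write it into the InfoSpoke
* selection table. The field names OHSOURCE/FIELDNM/LOW and all
* literal values below are assumptions - verify RSBSPOKESELSET in SE11.
REPORT z_set_spoke_selection.

DATA lv_low TYPE tvarv_val.

* Step 1: read the current value of the TVARV variable.
SELECT SINGLE low FROM tvarvc INTO lv_low
  WHERE name = 'Z_MYSPOKE_CALDAY'
    AND type = 'P'.

* Step 2: change the parameter value in the selection table.
UPDATE rsbspokeselset
   SET low = lv_low
 WHERE ohsource = 'ZMY_SPOKE'
   AND fieldnm  = 'CALDAY'.
COMMIT WORK.

* Step 3/4: run this program in the process chain before executing the
* InfoSpoke, so the spoke picks up the new selection.
```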

  • Reg. InfoSpokes and Open Hub

    Hi
    What is Open Hub and Info Spoke.
    And also the difference.
    regards
    Sridhar
    [email protected]

    Hi
You can use a DTP and transformations to load this data to the open hub service from the BI system:
• Create a data transfer process (DTP) by right-clicking the master data attributes
• In the extraction tab, specify the extraction mode (full)
• In the update tab, specify the error handling (request green)
• Activate the DTP and, in the execute tab, click the execute button to load data into the data targets.

  • Difference between INFOSPOKE  and OPEN HUB DESTINATION

Hi,
I want to know the difference between InfoSpoke and Open Hub destination in detail.
Thank you
    @jay

    Hi
    See this
    Re: Open Hub and Infospoke
    Bye
    N Ganesh

  • Open hub destination issue

    Hi,
    In our project we have Client Instance (eg: BWPD) and Application server (eg: BWPRD) defined for load balancing.
    We have created open hub and assigned destination server BWPD.
When I execute the DTP manually on BWPD, it runs successfully.
However, the same DTP fails when placed in the process chain, with the error message:
    No Such File or Directory
    Could not open file D:\usr\sap\BWPD\A01\work\Material on application server
    Error while updating to target ZXXXX.
Options tried:
Scheduled the process chain on the background server BWPD (the same server mentioned in the Open Hub destination); the DTP still failed.
Tried with the application server; it failed.
Tried with HOST as the option; it failed.
I couldn't make out what is going wrong. Any thoughts?
    Regards.

    Hi there,
I found this document quite useful; maybe it could shed some light on your issue:
Creating Open Hub Destination using a Logical file to extract the data
Also, what OS do you have? Is the syntax group created accordingly?

  • I am getting a dump while executing an open hub destination?

    I happened to delete a field in the open hub, then activated and re-ran it, and I am getting the following error:
    Error analysis
    You attempted to assign a field to a typed field symbol,
    but the field does not have the required type.
    The open hub is active, the DTP is active, and the transformations are active too, but the error remains.
    Can anyone advise me on this?
    thanks
    pooja

    Hi,
    See in
    http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
    Open Hub Destination
    Open Hub Destination: Part 1
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%201.pdf
    Open Hub Destination: Part 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/m-o/open%20hub%20destination%3a%20part%202.pdf
    Analysis Process Designer (APD)
    Analysis Process Designer (APD): Part - 1
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f06353dd-1fe3-2c10-7197-dd1a2ed3893e?quicklink=index&overridelayout=true 
    Analysis Process Designer (APD): Part - 2
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/a-c/analysis%20process%20designer%20(APD)%20Part%20-%202.pdf 
    Analysis Process Designer (APD): Part - 3
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/a-c/analysis%20process%20designer%20(APD)%3a%20Part%20-%203.pdf
    Thanks
    Reddy

  • Problem connecting to SAP Open Hub

    Hi, I am trying to set up an SSIS job connecting to SAP Open Hub and, with support from the SAP guys, have been able to make some progress, but it has now stopped on an error message we're not able to solve. Any suggestion on what can be wrong and how to solve it? When I run the package I get the following error message:
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" starting.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
    Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
    Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
    Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
    Information: 0x3E8 at Data Flow Task, SAP BW Source: Process Start Process, variant has status Completed (instance DH88PUV2SZBIFKMIF48K3USME)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: Process Data Transfer Process, variant /CPMB/HMIJYDZ -> ZOH_VPL has status Ended with errors (instance DTPR_DH88PUV2SZCA46Y9QNO66A6W6)
    Error: 0x3E8 at Data Flow Task, SAP BW Source: The component is stopping because the Request ID is "0".
    Error: 0x3E8 at Data Flow Task, SAP BW Source: No data was received.
    Error: 0xC0047062 at Data Flow Task, SAP BW Source [41]: System.Exception: No data was received.
       at Microsoft.SqlServer.Dts.SapBw.Components.SapBwSourceOHS.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
       at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
    Error: 0xC0047038 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on SAP BW Source returned error code 0x80131500.  The component returned a failure code when the pipeline engine called PrimeOutput().
    The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
    Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "OLE DB Destination" wrote 0 rows.
    Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
    Task failed: Data Flow Task
    Warning: 0x80019002 at Package3: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches
    the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    SSIS package "D:\Source\MSBI\SapLoadStaging\Package3.dtsx" finished: Failure.
    The program '[6916] DtsDebugHost.exe: DTS' has exited with code 0 (0x0)
    Regards
    Paal

    Hi Paleri,
    According to a thread with the same error message, the issue may be caused by incorrect RFC settings. Could you double-check your RFC connection configuration, such as the DNS settings?
    If that is not the case, please also make sure you have installed the correct version of the Microsoft Connector for SAP BW.
    Reference:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/e2fbafe5-d9df-490a-bfad-3d4b9784a8ea/sap-bi-connector-for-ssis-2008?forum=sqlintegrationservices
    Regards,
    Mike Yin
    TechNet Community Support

  • Help: open hub destination "file is not open" error!

    Hi all,
    I met a problem; I tried several times and always got error messages.
    It is about an open hub destination: I want to use it to send some files to the server, but when I execute the DTP, I get a system dump:
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Runtime Errors         DATASET_NOT_OPEN
    Except.                CX_SY_FILE_OPEN_MODE
    Date and Time          26.10.2011 01:18:09
    Short text
        File "I:\usr\sap\Q26\DVEBMGS11\work\ZSS_O0021.CSV" is not open.
    What happened?
        Error in the ABAP Application Program
        The current ABAP program "CL_RSB_FILE_APPLSRV===========CP" had to be
         terminated because it has come across a statement that
         unfortunately cannot be executed.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    I have two open hubs; one is fine, but the other one always gets this error. Help!

    Thanks! I checked my ID and found that I have the role to access this folder.
    Then I found that our Q system has two servers, A and B. For the open hub I should run my process chain on server B, but I had forgotten that, so I got the error. After I changed to server B, I re-activated my open hub and DTP, then refreshed the process chain and re-executed it, and the error disappeared.

  • How to create Fixed Length Flat File from Open Hub in BI 7.0

    My requirement is to produce a fixed-length flat file via an Open Hub destination. My Open Hub has four fields. Now the requirement is to create one extra field in the Open Hub which will contain the values of all four fields. In addition, the fields should be fixed length; that means if a field has no value and its length/type is CHAR4, then 4 spaces should appear as the field value. So, basically, the Open Hub output will be a single field which contains the information of the four fields.
    How do I get this using the end routine of the transformation (from DSO to Open Hub)?

    Hi,
    You can map the four input fields to the new field in the Open Hub and change the rule type to "Routine".
    For example, if your source fields are called "first", "second", "third" and "forth", the ABAP routine could be similar to:
    DATA: l_t_1 TYPE c LENGTH 4,
          l_t_2 TYPE c LENGTH 4,
          l_t_3 TYPE c LENGTH 4,
          l_t_4 TYPE c LENGTH 4.
    * A CHAR4 field is blank-padded automatically, so an initial source
    * field simply leaves four spaces in the helper variable.
    IF source_fields-first IS NOT INITIAL.
      l_t_1 = source_fields-first.
    ENDIF.
    IF source_fields-second IS NOT INITIAL.
      l_t_2 = source_fields-second.
    ENDIF.
    IF source_fields-third IS NOT INITIAL.
      l_t_3 = source_fields-third.
    ENDIF.
    IF source_fields-forth IS NOT INITIAL.
      l_t_4 = source_fields-forth.
    ENDIF.
    * RESPECTING BLANKS is needed here: a plain CONCATENATE drops the
    * trailing spaces and would destroy the fixed-length layout.
    CONCATENATE l_t_1 l_t_2 l_t_3 l_t_4 INTO result RESPECTING BLANKS.
    In the example, the routine leaves four blank spaces if any of the fields has no value.
    Additionally, if non-initial input values could be shorter than 4 characters (i.e. the input fields do not have a fixed length, such as string-typed fields), you could use STRLEN to check whether blank spaces need to be added to reach the fixed length of 4.
    I hope this helps you.
    Regards,
    Maximiliano

  • Open Hub, Delta Transformation

    Hi All,
    Thanks in advance for your help. I am extracting data from a DSO to a flat file using an Open Hub and a delta DTP. The problem I have is that if there is no delta (i.e. zero new/changed records) since the last time the open hub ran, my file is not cleared.
    For example, if I run the DTP in the morning and receive 100 records, the open hub creates a file in the UNIX directory with the 100 records. However, if in the evening I run the DTP again and there are no new records, the file is not cleared and remains with the 100 records. This is causing a problem down the line because the data is sent twice.
    Is there any way that the DTP can output blank files? Or can the file be cleared out in a different step of the process chain?
    your help is very appreciated.
    Thanks

    Hi,
    This note should correct your problem: 1111259
    Summary
    Symptom
    You extract data using a Delta data transfer process (DTP) to an open hub destination of type 'File on application server'. If the source does not contain any new data, the existing file is not deleted.
    Reason and Prerequisites
    This problem is caused by a program error.
    Solution
    SAP NetWeaver 7.0 BI
               Import Support Package 17 for SAP NetWeaver 7.0 BI (BI Patch 17 or SAPKW70017) into your BI system. The Support Package is available when Note 1106569 "SAPBINews BI7.0 Support Package 17", which describes this Support Package in more detail, is released for customers.
    In urgent cases, you can implement the correction instructions as an advance correction.
    Hope this helps.
    Regards,
    Diego

  • Open Hub Extraction

    Hello
    I am doing an open hub and sending data to a file. Now in the file I want to maintain a header line (the descriptions of the fields) for the fields I am sending to the file. How can I do that? Please can someone explain this to me briefly?
    thanks

    Hi,
    Once you run the InfoSpoke, it generates two files: a data file and a schema file. The data file contains the data, and the schema file contains the header, i.e. the structure of the InfoSpoke. So if you want to maintain the header in the data file, just copy the list of InfoObject names from the schema file and paste it as the header line of the data file.
    Thanks & Regards
    Ramakrishna Kamurthy
