Workflow & Webservices: data flow issue

Hi
Can anyone please guide me on the following?
How does data move from the task container to the web service (or to the BSP page)? I have done the required binding between the workflow and task container elements.
In my case, the right BSP page (the one associated with the task) is opened in the browser when the work item is executed, but the data is not visible on the page.
I maintained the ParameterIDs while creating/generating the task from the BSP page via WF_EXTSRV; they are visible in the workflow container.
What should these elements be bound to? Am I missing a step?
Thanking you in advance.
Regards,
Pramod.

Similar Messages

  • Issue with webservice data control while invoking bpel process from ADF side

    Hello Experts,
    we are creating a webservice data control to invoke a bpel process from the ADF side.
    When I run the application and invoke the bpel, everything works fine.
    But when we change instances from DEV to SIT, the bpel URL (hostname and port) changes.
    So we just edited the DataControls.dcx file, replaced the wsdl URL, and tried to run the application.
    This time the application does not invoke the bpel process.
    If I remove the webservice data control and recreate it with the new WSDL URL, then it works fine.
    Can anyone tell us what the exact issue is? Are there any other files I need to modify?
    Thanks & Regards
    Gayaz

    DataControls.dcx & connections.xml (.adf/META-INF) - the WSDL URL is stored in connections.xml as well, so update it in both files rather than only in DataControls.dcx.

  • Data flow error in workflow runtime

    I have many workflows (standard and custom) that raise errors at runtime.
    The data flow between the task container and the workflow container does not work if an element is empty, even though that element is not mandatory.
    List of errors:
    - ParForEach 000000
    - Object FLOWITEM method EXECUTE cannot be executed
    - and others...
    These errors occur if some information in the workflow container that is used in some data flow (binding definition) is empty; the workflow then raises an error at the start.
    List of specific error:
    Error during result processing of work item 000000395235
    Error when processing node '0000000083' (ParForEach index 000000)
    Error when creating a component of type 'Etapa'
    Error when creating a work item
    Error within method CL_SWF_RUN_WIM_BATCH->_CREATE_WORKITEM_CONTAINER
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
    These errors did not occur before the Support Package SAPKB70012.

    Hello Arghadip,
    Yes, the attribute is empty and it is not mandatory. These errors occur both in standard SAP workflows and in my own custom workflows.
    The errors occur if some information (an attribute) in the workflow container that is used in some data flow (binding definition) is empty.
    Example: I have a SendMail step in my workflow, and the email address is an attribute of the business object. When the previous step completes and the attribute (mail address) is empty, the error occurs.
    I believe the mail should simply not be sent to anybody instead of raising an error. I think this is a support package bug.
    Thanks,
    Kleber

  • Issues with result data display on ADF page from a Webservice data control

    Hi,
    I created a webservice data control and created a JSF page to display the webservice response on the screen.
    I dragged and dropped the input parameter onto the JSF form. I did the same for the output: I dragged and dropped the result tag onto the JSF page (selecting the read-only form).
    This webservice has complex input and output params.
    After supplying the input param on the JSF page and clicking submit, the request goes out, hits the webservice, and a proper result set comes back.
    But the issue is that the result is not displayed on the JSF screen.
    Is there any configuration needed to display the content on the screen?
    Version - JDev 11.1.1.4.0
    Regards,
    JJ

    Dear Vinod,
    Thanks for the reply.
    How do I refresh the data container? Just press the refresh button (F5), or is there some configuration needed to auto-refresh the data container?
    The following is my table definition
    <af:table rows="#{bindings.LookListRow.rangeSize}"
              fetchSize="#{bindings.LookListRow.rangeSize}"
              emptyText="#{bindings.LookListRow.viewable ? 'No data to display.' : 'Access Denied.'}"
              var="row"
              value="#{bindings.LookListRow.collectionModel}"
              rowBandingInterval="0"
              selectedRowKeys="#{bindings.LookListRow.collectionModel.selectedRow}"
              selectionListener="#{bindings.LookListRow.collectionModel.makeCurrent}"
              rowSelection="single"
              binding="#{backingBeanScope.backing_lkUp.t1}"
              id="t1" columnSelection="single">
      <af:column headerText="#{bindings.LookListRow.hints.rowAction.label}"
                 sortProperty="rowAction" sortable="false" id="c5">
        <af:outputText value="#{row.rowAction}" id="ot4"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.FieldValue.label}"
                 sortProperty="FieldValue" sortable="false" id="c8">
        <af:outputText value="#{row.FieldValue}" id="ot8"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.FieldName.label}"
                 sortProperty="FieldName" sortable="false" id="c2">
        <af:outputText value="#{row.FieldName}" id="ot7"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.Description.label}"
                 sortProperty="Description" sortable="false" id="c1">
        <af:outputText value="#{row.Description}" id="ot2"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.StatusasofEffectiveDate.label}"
                 sortProperty="StatusasofEffectiveDate" sortable="false" id="c4">
        <af:outputText value="#{row.StatusasofEffectiveDate}" id="ot5"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.LanguageCode.label}"
                 sortProperty="LanguageCode" sortable="false" id="c6">
        <af:outputText value="#{row.LanguageCode}" id="ot6"/>
      </af:column>
      <af:column headerText="#{bindings.LookListRow.hints.Version.label}"
                 sortProperty="Version" sortable="false" id="c3">
        <af:outputText value="#{row.Version}" id="ot3">
          <af:convertNumber groupingUsed="false"
                            pattern="#{bindings.LookListRow.hints.Version.format}"/>
        </af:outputText>
      </af:column>
    </af:table>
    Regards,
    -JJ

  • Data Flow is not working in Workflow

    Hello!
    I want to start my workflow with 'SAP_WAPI_CREATE_EVENT'.
    But the workflow is not starting because there is an error in the data flow of its task. The goal of the workflow is that the event sends an email to the employee.
    What should I put in the data flow?

    I created an ABAP class with the event 'create_wf' and the method 'create_workflow'.
    This class includes the interfaces bi_object, bi_persistent and if_workflow.
    My method 'create_workflow' calls the FM 'SAP_WAPI_CREATE_EVENT'. But what are the exporting parameters 'object_type', 'object_key' and 'event'? Is 'event' my own event 'create_wf'? And where do I define the event? (My report calls the method 'create_workflow'. Is this the definition?)
    In my workflow I have the task, and I don't know which objects or parameters to put into the data flow. In the task (tab 'Basic data') I have the object category 'ABAP class' with the object type 'zsieb_startwf' (the name of my class) and the method I explained before, 'create_wf' (which is my event in the class). On the event trigger tab I entered the same data again.
    Did I forget anything?
    Where do I integrate my workflow into my class? Or is this unnecessary?
    I hope you understand my problem better now.
    Hi Anna,
    I think you may be confusing 'events' as they relate to Workflow and 'events' as they exist in classes.
    There is a table which contains business object related events, SWETYPV.  This provides the linkage between your business object (such as Invoice, BUS2081, or Purchase Order, BUS2012) and Event (Created, Changed, etc) and your workflow.
    You need to know what object type and what event (in the workflow context) are required before you can proceed.
    So if you are trying to start a workflow from SAP_WAPI_CREATE_EVENT, your import parameters might be:
      data: lv_retcode type sy-subrc,
            lv_eventid type swr_struct-event_id.  " types as in the FM signature; verify in SE37

      call function 'SAP_WAPI_CREATE_EVENT'
        exporting
          object_type    = 'BUS2081'
          object_key     = yourdocumentkey   " placeholder for your document key
          event          = 'CREATED'
          commit_work    = 'X'
          event_language = sy-langu
        importing
          return_code    = lv_retcode
          event_id       = lv_eventid.
    Then, in transaction SWETYPV, you would have an entry linking the Object/Event pair to the 'receiver', i.e. the workflow template, with a receiver function module such as SWW_WI_CREATE_VIA_EVENT_IBF that actually starts the workflow when the event is raised.
    Regards,
    Sue

  • Issue with Data flow between Unicode and Non Unicode systems

    Hello,
    I have a scenario as below:
    We have a Unicode ECC 6.0 system and a UTF-7 legacy system.
    A message flows from the legacy system to the ECC 6.0 system, and the data is about 700 KB in size.
    Will there be any issue with this, given that one system is Unicode and the other is non-Unicode?
    Kindly let me know.
    Thanks & Regards
    Vivek

    Hi,
    To add to Mike's post...
    You indicate that your legacy system is non-Unicode and the ERP system is Unicode. You also said that the data flows only from the legacy system to the ERP system. In this case, you should have no data issues, since the Unicode system is the receiving system. There are data issues when the data flows in the other direction, from a Unicode system to a non-Unicode system: the non-Unicode system can only process characters that exist on its codepage, and care must be taken on the sending side to ensure that only characters on the receiving system's codepage are sent (as Mike says above).
    Best Regards,
    Matt

  • Config issue:  no data flow to profit center

    Hello,
    I'm on ECC 6.0, and I'm trying to post a GL document to a profit center using FB50 and then run a profit center assessment. But when I go to report KE5Z (profit center actual line item report), I don't see any values flowing to the profit center. Because of this, my profit center assessment cycle gives the warning message "cycle doesn't contain any senders".
    Are there any configuration steps that need to be done in order to let the data flow to the profit center?
    please advise.

    The version should be set to 1.

  • Data Services 4.2 upgrade issue - R/3 ABAP data flow error

    This error would make sense in the PROD environment, but can it also occur when running against the ECC DEV environment?
    I don't think it makes sense to use the 'execute preloaded' option against DEV.
    Steps performed for connecting to ECC through DS 4.2:
    1. Basis imported the new functions into ECC, which we received after raising an OSS message.
    2. Gave the authorizations as per the manual: S_BTCH_JOB, S_DEVELOP, S_RFC, S_TABU_DIS, S_TCODE.
    3. Ran a simple R/3 data flow (shared directory transfer method), which resulted in the error RFC_ABAP_INSTALL_AND_RUN: RFC_ABAP_MESSAGE, changes to repository objects are not permitted in this client.
    Do we need more permissions than those listed above to avoid this error?

    Hello,
    I ran the 'R3trans -x' command, but there was no problem; the connection to the database was working.
    The problem was the following:
    Before starting the sdt service on the host, I had set the environment variables JAVA_HOME and LD_LIBRARY_PATH for sidadm. That is not necessary, and that was the problem. Without setting these variables it is working now.
    Thanks,
    Julia

  • Using asynchronous timer for data flow control

    Hi all,
      I am using system sleep calls to control the data flow (some digital lines and analog output). The pseudo code is something like this:
    Sleep(150);
    // the following sections are executed in parallel
    #pragma omp parallel sections
    {
        #pragma omp section
        DAQmxWriteDigitalLines(...); // output TTL to one digital line
        #pragma omp section
        DAQmxWriteDigitalLines(...); // output TTL to another digital line
        #pragma omp section
        Sleep(2); // sleep 2 ms
    }
    // the following sections are executed in parallel
    #pragma omp parallel sections
    {
        #pragma omp section
        DAQmxWriteDigitalLines(...); // output TTL to one digital line
        #pragma omp section
        DAQmxWriteAnalogScalarF64(...); // analog output to one channel
        #pragma omp section
        Sleep(1); // delay 1 ms
    }
    // the following sections are executed in parallel
    #pragma omp parallel sections
    {
        #pragma omp section
        DAQmxWriteDigitalLines(...); // output TTL to one digital line
        #pragma omp section
        DAQmxWriteAnalogScalarF64(...); // analog output to one channel
        #pragma omp section
        DAQmxWriteAnalogScalarF64(...); // analog output to another channel
        #pragma omp section
        Sleep(11); // delay 11 ms
    }
    // ... other stuff
    I am running Windows XP and I know it is not possible to get real-time control, but I want timing that is as precise as possible. The above code is not perfect, but it works 95% of the time. I just read an article about using an asynchronous timer to control the time delay. I tried that idea with the following code frame:
    int CVICALLBACK ATCallback(int reserved, int timerId, int event,
                               void *callbackData, int eventData1, int eventData2)
    {
        if (event == EVENT_TIMER_TICK)
        {
            int *nextdelay = (int *)callbackData;
            SuspendAsyncTimerCallbacks();
            if (timerId >= 0)
            {
                double time;
                if (*nextdelay == 0) time = 2.0;
                else if (*nextdelay == 1) time = 1.0;
                else time = 12.0;
                SetAsyncTimerAttribute(timerId, ASYNC_ATTR_INTERVAL, time);
            }
            if (*nextdelay == 0)
            {
                #pragma omp parallel sections
                {
                    #pragma omp section
                    DAQmxWriteDigitalLines(...); // output TTL to one digital line
                    #pragma omp section
                    DAQmxWriteDigitalLines(...); // output TTL to another digital line
                }
                (*nextdelay)++; /* increment the state value, not the pointer */
            }
            else if (*nextdelay == 2)
            {
                #pragma omp parallel sections
                {
                    #pragma omp section
                    DAQmxWriteDigitalLines(...); // output TTL to one digital line
                    #pragma omp section
                    DAQmxWriteAnalogScalarF64(...); // analog output to one channel
                }
                (*nextdelay)++;
            }
            else if (*nextdelay == 3)
            {
                #pragma omp parallel sections
                {
                    #pragma omp section
                    DAQmxWriteDigitalLines(...); // output TTL to one digital line
                    #pragma omp section
                    DAQmxWriteAnalogScalarF64(...); // analog output to one channel
                    #pragma omp section
                    DAQmxWriteAnalogScalarF64(...); // analog output to another channel
                }
                (*nextdelay)++;
            }
            ResumeAsyncTimerCallbacks();
        }
        return 0;
    }

    void main(void)
    {
        int n = 0;
        int timeid;
        timeid = NewAsyncTimer(120.0 / 1000.0, 3, 1, ATCallback, &n);
    }
    But it doesn't work. There are no compilation or runtime errors, but the timing is just not right. I wonder, do I have to suspend the timer in the callback function when I reset the delay for the next call? If I do so, I worry that it will add too much delay (since I suspend and resume the timer while handling the delay) and cause even worse timing. But if I don't suspend the timer before I reset the interval, what happens if the code running in the callback has not finished before the next callback arrives? It is quite confusing how to use an asynchronous timer in this case. Thanks.

    Yeah, unfortunately the 6711 doesn't have clocked digital I/O.  There are only two counters anyway, so even if you could use them to generate your signals you wouldn't have enough (*maybe* something with the 4 AO channels and a counter, depending on what your output signals need to look like?  The AO channels can output "digital" as well if you write 0V or 5V only; see the sketch below).
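    A hypothetical sketch of driving an AO channel as a "digital" line this way, assuming the NI-DAQmx C API; "Dev1/ao0" is a placeholder channel name and error checking is omitted:

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle aoTask = 0;

        DAQmxCreateTask("", &aoTask);
        /* one AO channel with a 0-5 V range, used as a pseudo-digital line */
        DAQmxCreateAOVoltageChan(aoTask, "Dev1/ao0", "", 0.0, 5.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxStartTask(aoTask);

        DAQmxWriteAnalogScalarF64(aoTask, 1, 10.0, 5.0, NULL); /* drive "high" (5 V) */
        DAQmxWriteAnalogScalarF64(aoTask, 1, 10.0, 0.0, NULL); /* drive "low"  (0 V) */

        DAQmxStopTask(aoTask);
        DAQmxClearTask(aoTask);
        return 0;
    }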
    A PCI DAQ card which does support clocked digital I/O and has 2 analog outputs is the 6221 (or if you could use PCIe the 6321 is a more updated version with two extra counters and some additional functionality).
    If there isn't a way to implement clocked outputs after all, one thing you could do to make your code a little more efficient is to consolidate the writes.  You can put your digital lines into a single task and write them at once, and you can put your analog channels into a single task and write them at once as well; see the sketch below.
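    A minimal sketch of that consolidation, again assuming the NI-DAQmx C API; the device/line string "Dev1/port0/line0:1" is a placeholder and error checking is omitted:

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle doTask = 0;
        uInt8 lineStates[2] = {1, 0};   /* one value per line in the task */
        int32 written = 0;

        DAQmxCreateTask("", &doTask);
        /* both digital lines live in ONE task instead of two */
        DAQmxCreateDOChan(doTask, "Dev1/port0/line0:1", "",
                          DAQmx_Val_ChanForAllLines);
        DAQmxStartTask(doTask);

        /* a single software-timed call updates both lines together,
           replacing the two separate DAQmxWriteDigitalLines() calls */
        DAQmxWriteDigitalLines(doTask, 1, 1, 10.0, DAQmx_Val_GroupByChannel,
                               lineStates, &written, NULL);

        DAQmxStopTask(doTask);
        DAQmxClearTask(doTask);
        return 0;
    }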
    I'm not sure about the callback issue, you might find some more help in the CVI forum.  I don't think it's going to solve your underlying problem though as ultimately the execution timing of your software calls is at the mercy of your OS.
    Best Regards,
    John Passiak

  • Display Data Flow - Short Dump

    Hi all,
    When I select 'display data flow' for any cube, it results in a short dump.
    I have searched previous forum questions for an answer, but I could only find solutions for earlier BW versions, not for BI7.
    Could you please let me know the solution for this issue?
    Thanks & Regards,
    Eswari

    Hi All,
    Thank you very much for all of your responses.
    I am working on Support Package 10.
    Here is the detailed description of the short dump.
    Short text
        The current application triggered a termination with a short dump.
    What happened?
        The current application program detected a situation which really
        should not occur. Therefore, a termination with a short dump was
        triggered on purpose by the key word MESSAGE (type X).
    What can you do?
        Note down which actions and inputs caused the error.
        To process the problem further, contact your SAP system
        administrator.
        Using Transaction ST22 for ABAP Dump Analysis, you can look
        at and manage termination messages, and you can also
        keep them for a long time.
    Error analysis
        Short text of error message:
        GP: Control Framework returned an error; contact system administrator
        Long text of error message:
         Diagnosis
             The Graphical Framework is based on the basis technology known as
             the Control Framework. A method in the Control Framework returned
             an error.
         Procedure
             It probably involves a programming error. You should contact your
             system administrator.
         Procedure for System Administration
             Check the programming of the graphics proxy especially for the
             parameters that were sent and, if necessary, correct your program.
        Technical information about the message:
        Message class....... "APPLG"
        Number.............. 229
        Variable 1.......... " "
        Variable 2.......... " "
        Variable 3.......... " "
        Variable 4.......... " "
    How to correct the error
        Probably the only way to eliminate the error is to correct the program.
        If the error occurs in a non-modified SAP program, you may be able to
        find an interim solution in an SAP Note.
        If you have access to SAP Notes, carry out a search with the following
        keywords:
        "MESSAGE_TYPE_X" " "
        "CL_AWB_OBJECT_NET_SAPGUI======CP" or "CL_AWB_OBJECT_NET_SAPGUI======CM005"
        "PBO"
        If you cannot solve the problem yourself and want to send an error
        notification to SAP, include the following information:
        1. The description of the current problem (short dump)
           To save the description, choose "System->List->Save->Local File
        (Unconverted)".
        2. Corresponding system log
           Display the system log by calling transaction SM21.
           Restrict the time interval to 10 minutes before and five minutes
        after the short dump. Then choose "System->List->Save->Local File
        (Unconverted)".
        3. If the problem occurs in a program of your own or a modified SAP
        program: The source code of the program
           In the editor, choose "Utilities->More
        Utilities->Upload/Download->Download".
       4. Details about the conditions under which the error occurred or which
       actions and input led to the error.
    Thanks,
    Eswari.

  • Error on Data Flow Task MSSQL 2012 Clustered "Description: The version of Lookup is not compatible with this version of the DataFlow. "

    We have an SSIS package that runs on clustered MSSQL 2012 Enterprise nodes and is failing.  We use a job to execute the package.
    Environmental information:
    Product - Microsoft SQL Server Enterprise: Core-based Licensing (64-bit)
    Operating System - Microsoft Windows NT 6.1 (7601)
    Platform - NT x64
    Version - MSSQL Version 11.0.3349.0
    The package is set to 32-bit.  All permissions are verified.  It runs in lower environments with the same MSSQL version.  All environments are clustered.  In the failing environment, all nodes are at the same service pack.  I have not verified whether all nodes in the failing environment have SSIS installed.  Data access is installed.  We have other, simpler packages that run in this environment, just not this one.  Time to ask the community for help!
    Error:
    Source: Data Flow Task - Data Flow Task (SSIS.Pipeline)     Description: The version of Lookup is not compatible with this version of the DataFlow.  End Error  Error:  Code: 0xC0048020    
    Description: Component "Conditional Split, clsid {7F88F654-4E20-4D14-84F4-AF9C925D3087}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.  End Error 
    Description: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Conditional Split;Microsoft Corporation; Microsoft SQL Server; (C) Microsoft Corporation; All Rights
    Reserved; http://www.microsoft.com/sql/support;0".  End Error 
    (Shop-specific information left out.  This is the first error returned by the job history for this package.)
    Thanks in advance.

    Hi DeveloperMax,
    According to your description, the error occurs when you execute the package with an Agent job on clustered MSSQL 2012 Enterprise nodes.
    As per my understanding, this issue can occur when you use SQL Server Agent to schedule a SQL Server Integration Services package in a 64-bit environment while the SSIS package references 32-bit DLLs or 32-bit drivers that are available only in 32-bit versions, so the job fails.
    To fix this issue, use the 32-bit version of the DTExec.exe utility. To run a package in 32-bit mode from a 64-bit version of SQL Server Agent, go to the Job Step dialog box and select "32 bit runtime" in the Advanced tab.
    Besides, we should make sure that SQL Server Integration Services is installed in the failing environment.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Data flows start but do not complete successfully while extracting/loading data

    Hello People,
    We are seeing abnormal behavior with the data flows in a Data Services job.
    Scenario:
    We are extracting data from the CRM side in parallel. Please refer to the build:
    a. We have 5 main workflows, i.e.:
       => Main WF1 has 6 more sub-WFs in it, and each sub-WF has 1-2 DFs associated in parallel.
       => Main WF2 has 21 DFs and 1 WFa (with a DF and a WFb; WFb has 1 DF) in parallel.
       => Main WF3 has 1 DF in parallel.
       => Main WF4 has 3 DFs in parallel.
       => Main WF5 has 1 WF and a DF in sequence.
    b. Usually the job works perfectly fine but, sometimes it gets stuck at a DF without any error logs.
    c. The job doesn't get stuck at a specific data flow or on a specific day; it often gets stuck at different DFs.
    d. Observations in the Monitor Log (LT = lapse time, AT = absolute time):
       Dataflow        State     RowCnt   LT      AT
       +DF1/ZABAPDF    PROCEED   234000   8.113   394.164
       /DF1/Query      PROCEED   234000   8.159   394.242
       -DF1/Query_2    PROCEED   234000   8.159   394.242
    If you check the monitor log, the state of data flow DF1 remains PROCEED till the end; ideally it should complete.
    In successful jobs, the status for DF1 is STOP. This DF takes approx. 2 min to execute.
    The row count for the DF1 extraction is 234204 but, it got stuck at 234000.
    We then terminate the job after some time, but surprisingly it executes successfully the next day.
    e. Analyzing all the failed jobs, the same behavior was observed across the different data flows that got stuck during execution. The logic of the data flows is perfectly fine.
    Observations in the Trace log:
    DATAFLOW: Process to execute data flow <DF1> is started.
    DATAFLOW: Data flow <DF1> is started.
    ABAP: ABAP flow <ZABAPDF> is started.
    ABAP: ABAP flow <ZABAPDF> is completed.
    Cache statistics determined that data flow <DF1>
    uses <0>caches with a total size of <0> bytes. This is less than(or equal to) the virtual memory <1609564160> bytes available for caches.
    Statistics is switching the cache type to IN MEMORY.
    DATAFLOW: Data flow <DF1> using IN MEMORY Cache.
    DATAFLOW: <DF1> is completed successfully.
    The highlighted text (the cache statistics and completion messages at the end of this trace) appears for the successful job but does not appear for the unsuccessful one.
    Note: the cache type is pageable cache, and the DS version is 3.2.
    Please suggest.
    Regards,
    Santosh

    Hi Santosh,
    Just a wild guess:
    Would you be able to replicate all the DFs/WFs, delete the original DFs/WFs, rename the replicated objects back to the original names (for your convenience), and execute the job?
    Sometimes the reference does not work.
    Hope this works.
    Regards,
    Shiva Sahu

  • Data flow fails on packed decimal field moving iSeries DB2 data from one iSeries DB to another

    I'm trying to use SSIS to move table content from one iSeries DB2 database to another.  I'm using the .Net Providers for OleDb\IBM DB2 for i5/OS IBMDA400 OLE DB Provider in the connection managers for the source and destination, and the test connection works fine.  When I try to run the data flow task, however, it fails on the first packed decimal field it encounters with the exceptions ...
    [select from hydro520 hydroweb2 blpmstr [16]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "component "select from hydro520 hydroweb2 blpmstr" (16)" failed because error code 0x80004002 occurred, and the error
    row disposition on "output column "MSPRIB" (55)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    [select from hydro520 hydroweb2 blpmstr [16]] Error: The component "select from hydro520 hydroweb2 blpmstr" (16) was unable to process the data. Pipeline component has returned HRESULT error code 0xC0209029 from a method call.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "select from hydro520 hydroweb2 blpmstr" (16) returned error code 0xC02090F5.  The component returned a failure code when the pipeline
    engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
    ...in the progress tab.  Can someone kindly tell me what I need to do to get the connection manager to work with DB2 packed decimal fields?  Or is it a different issue altogether?  Thanks tonnes for any help, Roscoe

    Hi rpfinn,
    From the data type mapping rules between SSIS and DB2, we can see that both the NUMERIC and DECIMAL data types in DB2 are mapped to the DT_NUMERIC data type in SSIS. Since the source data in your DB2 database is of the NUMERIC data type, changing the DT_NUMERIC data type to DT_DECIMAL is invalid. Besides, if we check the data types of the target External column and Output column in the Advanced Editor for the ADO NET Source adapter, the data type should be defined as DT_NUMERIC with precision 9 and scale 2. I am not clear where you see DT_NUMERIC(9,0), i.e. DT_NUMERIC with precision 9 and scale 0, but it may be the cause of the issue. You need to make sure the DT_NUMERIC data type also has scale 2 instead of 0.
    If you don't know how to modify the data type, please elaborate on the Data Flow Task of the package so that we can make further analysis. Besides, the error messages you posted are incomplete; it would be helpful if you posted the complete error message.
    Regards,
    Mike Yin
    TechNet Community Support

  • Problem with context mapping and data flow in a FPM application

    Hi All,
    I am trying to develop an ESS application using FPM. The requirement is to see the history of an employee in the second view.
    The first view has just the overview information and the second one has the detail, so the records and the fields are the same on both views.
    As per the FPM guidelines, the model resides in the FC component and the respective VC components use the model data accordingly.
    I am executing the model in the FC component, calling the executable method in the interface controller from the first view, and then displaying the output data of the BAPI in the first view, which provides the overview information. This works fine.
    But when I map the same output node to the Table UI of the second view, the record size is zero and thus no information is available.
    To work around this, I am executing the RFC again in the interface controller of the second view to populate the records, which is incorrect because it has already been executed and the data is available from the first view.
    Please let me know the correct approach to context mapping and data flow when using the FPM roadmap. Is there any standard method or approach available to deal with such requirements? Please let me know.
    Thanks in advance.
    Regards
    DK

    Hi Idhaya,
    The model node is available in the FC, and the FC interface controller is used in the first VC and the second VC.
    The idea is: since the executable method is generated in the FC, I created a custom method in the FC that calls the executable method, passing in the input parameter; this custom method is then called from the first VC.
    So now my first VC calls the custom method in the FC and executes the RFC. Once the RFC is executed, the nodes in the FC should be populated, which is the ideal case.
    And as the FC is used as a component in the second VC, the same node is available to its UI elements.
    But when I check the record size of the output node, it is always zero for the second VC.
    Regards
    DK

  • Data Flow terminated due to error 120307

    Hi.
    I get this error when executing a project.
    Source system: Sybase IQ.
    Target system: SAP HANA.
    Some of the tables were copied successfully, but the job terminated anyway.
    I attached screenshots of the Progress screen and Monitoring.
    The error log is empty.
    In the trace log I see the errors from the subject line.
    I also have another strange message in the trace log:
    Cache statistics determined that data flow <SYBASE_IQ_2_HOD_DBA_FACT_FINAL> uses 0 caches with a total size of 0 bytes, which is less than (or equal to) 3757047808 bytes available for caches in virtual memory. Data flow will use IN MEMORY cache type.

    I executed this job from the Data Services Designer and got another error:
    main Bufman: An error was detected on a database page. You may have a damaged index. For additional information, please check your IQ message file or run sp_iqcheckdb
    I am trying to find the issue on Google.
