ABAP data flow error in BODS

Hi All,
I have started running a simple ABAP data flow in BODS. The data flow has been designed and executed, and the error below is thrown.
It seems to be an ABAP driver issue rather than a problem with the design of the flow.
Could anyone suggest what the actual issue is, and let me know if you need any further information?
Thanks.
Best Regards,
Edu

Hi Konakanchi,
Please share the job execution log for more details, e.g. whether this is an RFC configuration issue or a BODS issue. Meanwhile, could you please run the RFC connection test through SAPGUI?
Thanks,
Daya

Similar Messages

  • Data Service 4.2 upgrade issue - R/3 abap data flow error

This error makes sense if you get it in a PROD environment, but any idea whether it can also occur when running against an ECC DEV environment? I don't think it makes sense to use the 'execute preloaded' option against DEV.
    Steps performed for connecting to ECC through DS 4.2:
    1. Basis Imported the new functions into ECC which we got after raising an OSS with them.
    2. Gave the authorizations as per the manual.
S_BTCH_JOB, S_DEVELOP, S_RFC, S_TABU_DIS, S_TCODE
3. Ran a simple R/3 data flow (Shared Directory transfer method), which resulted in the error RFC_ABAP_INSTALL_AND_RUN: RFC_ABAP_MESSAGE, 'Changes to repository objects are not permitted in this client.'
Do we need more permissions than those listed above to avoid this error?

    Hello,
I ran the 'R3trans -x' command, but there was no problem; the connection to the database was working.
The problem was the following: before starting the SDT service on the host, I had set the environment variables JAVA_HOME and LD_LIBRARY_PATH for sidadm. That is not necessary, and that was the problem. Without setting these variables, it is working now.
    Thanks,
    Julia

  • Creating abap data flow, open file error

    hello experts,
I am trying to pull all the fields of the MARA table into BODS, so I am using an ABAP data flow. After executing the job I get the error "can't open the .dat file".
I am new to ABAP data flows, so I think I may have made a mistake in the configuration of the datastore.
Can anyone guide me on how to create a datastore for an ABAP data flow?

In your SAP Applications datastore, are you using "Shared Directory" or "FTP" as the "Data transfer method"? Given the error, probably the former. In that case, the account used by the Data Services job server must have access to wherever SAP is putting the .DAT files.
When you run an ABAP dataflow, SAP runs the ABAP extraction code (of course) and then exports or saves the results to a .DAT file, which I believe is just a tab-delimited flat text file, in the folder "Working directory on SAP server." This is specified from the perspective of the SAP server, e.g. "E:\BODS\TX," where the E:\BODS\TX folder is local to the SAP application server. I believe this folder is specified as a directive to the ABAP code, telling SAP where to put the .DAT files.
The DS job server then picks the file up from there, and you tell it how to get there via "Application path to the shared directory," which, in the above case, might be "\\SAPDEV1\BODS\TX" if you shared out the E:\BODS folder as "BODS" and the SAP server was SAPDEV1. Anyway: the DS job server needs to be able to read files at \\SAPDEV1\BODS\TX, and it may not have any rights to do so, especially if it's just logging in as Local System. That's likely your problem. In a Windows networking environment, I always have the DS job server log in using an AD account, which then needs to be granted privileges to, in our example's case, the \\SAPDEV1\BODS\TX folder. That also comes in handy for getting to data sources, sometimes.
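In other words, the two path settings describe the same folder from two different vantage points. A sketch of the datastore settings, using the hypothetical names from the example above:

    Data transfer method:                       Shared Directory
    Working directory on SAP server:            E:\BODS\TX          (the folder as seen by the SAP application server)
    Application path to the shared directory:   \\SAPDEV1\BODS\TX   (the same folder as seen by the DS job server)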
    Best wishes,
    Jeff Prenevost
    Data Services Practice Manager
    itelligence

  • Using ABAP DATA FLOW to pull data from APO tables

I am trying to use an ABAP data flow to pull data from APO and receive error 150301. I can do a direct table pull and receive no error, but when I put the same table in an ABAP data flow I get the issue. Any help would be great.

    Hi
I know you "closed" this, but someone else might read it, so I'll add that when you use an ABAP dataflow, logic such as table joins and filters can be pushed down to ECC (which can be seen in the generated ABAP).
    Michael

  • DS 4.2 get ECC CDHDR deltas in ABAP data flow using last run log table

I have a DS 4.2 batch job where I'm trying to get ECC CDHDR deltas inside an ABAP data flow. My SQL Server log table has an ECC CDHDR last_run_date_time (e.g. '6/6/2014 10:10:00'), which I select at the start of the DS 4.2 batch job run and then update to the current run date/time at the end of the run.
The problem is that CDHDR has the date (UDATE) and time (UTIME) in separate fields, and inside an ABAP data flow only a limited set of DS functions is available. For example, outside of the ABAP data flow I could use the DS function concat_date_time on UDATE and UTIME to get a where clause of 'concat_date_time(UDATE, UTIME) > last_run_date_time and concat_date_time(UDATE, UTIME) <= current_run_date_time'. However, inside the ABAP data flow the DS function concat_date_time is not available. Is there some way to concatenate UDATE + UTIME inside an ABAP data flow?
    Any help is appreciated.
    Thanks,
    Brad

    Michael,
    I'm trying to concatenate date and time and here's my ABAP data flow where clause:
    CDHDR.OBJECTCLAS in ('DEBI', 'KRED', 'MATERIAL')
    and ((CDHDR.UDATE || ' ' || CDHDR.UTIME) > $CDHDR_Last_Run_Date_Time)
    and ((CDHDR.UDATE || ' ' || CDHDR.UTIME) <= $Run_Date_Time)
    Here are DS print statements showing my global variable values:
    $Run_Date_Time is 2014.06.09 14:14:35
    $CDHDR_Last_Run_Date_Time is 1900.01.01 00:00:01
The issue is that I just created a CDHDR record with a UDATE of '06/09/2014' and a UTIME of '10:48:27', and it's not being pulled in the ABAP data flow. Here are selected contents of the generated ABAP file (*.aba):
    PARAMETER $PARAM1 TYPE D.
    PARAMETER $PARAM2 TYPE D.
    concatenate CDHDR-UDATE ' ' into ALTMP1.
    concatenate ALTMP1 CDHDR-UTIME into ALTMP2.
    concatenate CDHDR-UDATE ' ' into ALTMP3.
    concatenate ALTMP3 CDHDR-UTIME into ALTMP4.
    IF ( ( ALTMP4 <= $PARAM2 )
    AND ( ALTMP2 > $PARAM1 ) ).
So $PARAM1 corresponds to $CDHDR_Last_Run_Date_Time ('1900.01.01 00:00:01') and $PARAM2 corresponds to $Run_Date_Time ('2014.06.09 14:14:35'). But as I understand it, ABAP data type D is date-only (YYYYMMDD) and doesn't include time, so is my time somehow being defaulted to '00:00:00' when it gets to DS? I ask because a CDHDR record I created on 6/6 wasn't pulled during my 6/6 testing, but that same 6/6 record was pulled today.
I can get last_run_date_time and current_run_date_time into separate date and time fields, but I'm not sure how to build the where clause using separate date and time fields. Do you have any recommendations, or is there a better way to pull CDHDR deltas in an ABAP data flow than using a last run log table?
    Thanks,
    Brad
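One way to sidestep both the missing concat_date_time function and the ABAP type D (date-only) truncation is to compare the date and time columns separately, so that no concatenation is needed at all. A sketch in the same where-clause syntax, assuming the two timestamps are first split into hypothetical date and time global variables ($Last_Run_Date, $Last_Run_Time, $Run_Date, $Run_Time):

CDHDR.OBJECTCLAS in ('DEBI', 'KRED', 'MATERIAL')
and ( CDHDR.UDATE > $Last_Run_Date
      or ( CDHDR.UDATE = $Last_Run_Date and CDHDR.UTIME > $Last_Run_Time ) )
and ( CDHDR.UDATE < $Run_Date
      or ( CDHDR.UDATE = $Run_Date and CDHDR.UTIME <= $Run_Time ) )

Every comparison then maps to a plain date or time comparison in the generated ABAP, and the clause selects exactly the interval (last run, current run].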

  • ABAP Data flow

    Hi
Can we replicate an ABAP data flow and modify the copy for a history data upload?

It is a copy, and whatever changes you make to it will not impact the other ABAP dataflows.

  • Totem + GStreamer + srt subtitle : Internal data flow error

    Hello all,
    I am trying to watch a video with a subtitle in "srt" format, with Totem + GStreamer.
I get a popup with the error message 'Internal data flow error' and my video cannot play.
If I remove the subtitle file, I have no problem watching the video. I do not have this problem with Totem + Xine, but I want to use GStreamer.
    I have posted a bug report at Gnome.
Do you have the same issue? Can anybody help?
    Thank you very much
    Cheers,
    Chicha

Thank you Don-DiZzLe for your help.
I tried totem-xine and it worked, but there is something annoying about totem-xine: you cannot change the subtitle font size.
That is why I tried totem-gstreamer and found this bug.
This is supposed to work according to GStreamer's website, so if I could help fix this I would be very happy.

  • ABAP Data Flows - Parallel Execution?

    Hi Guys,
If I have a data flow that contains multiple, let's say 3, ABAP data flows, I see that when the job is started only one ABAP flow at a time will run, even though no precedence is enforced and they could all kick off together.
    Is there a way to get these ABAP flows to all run in parallel in SAP?
    Thanks,
    Flip.

    Hi Flip,
I've never actually tried this, but the Performance Optimization Guide only states that dataflows and workflows can be processed in parallel. So I would suppose that if you placed each ABAP dataflow into its own separate dataflow and then encapsulated the 3 unlinked dataflows in a workflow, the ABAP dataflows should execute in parallel, since they are initiated by the dataflows. Roughly, the layout would look like the sketch below.
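A sketch of that layout (all object names hypothetical):

WF_Parallel_Load
├─ DF_Wrapper_1  (contains ABAP_DF_1)
├─ DF_Wrapper_2  (contains ABAP_DF_2)
└─ DF_Wrapper_3  (contains ABAP_DF_3)

Because the three wrapper dataflows inside the workflow are not linked to each other, the job server is free to start them at the same time.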
    Clint.

  • Data flow error in workflow runtime

I have many workflows (standard and custom) that produce errors at runtime.
The data flow between the task container and the workflow container does not work if an element is empty, even though the element is not mandatory.
    List of errors:
    - ParForEach 000000
    - Object FLOWITEM method EXECUTE cannot be executed
    - and others...
These errors occur if some information in the workflow container that is used in some data flow (binding definition) is empty; the workflow then fails right at the start.
    List of specific error:
    Error during result processing of work item 000000395235
    Error when processing node '0000000083' (ParForEach index 000000)
    Error when creating a component of type 'Etapa'
    Error when creating a work item
    Error within method CL_SWF_RUN_WIM_BATCH->_CREATE_WORKITEM_CONTAINER
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Source (expression '&STANDARDMATERIAL.MAILCENTRAL&') of binding assignment is not available
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
    Error in the evaluation of expression '&STANDARDMATERIAL<???>.MAILCENTRAL&' for item '17'
    Error when determining attribute 'MAILCENTRAL' of object instance '[BO.BUS1001006.000000000010279
These errors did not occur before Support Package SAPKB70012.

    Hello Arghadip,
Yes, the attribute is empty and it is not mandatory. These errors occur both in standard SAP workflows and in customer workflows (my own developments).
They occur if some information (an attribute) in the workflow container that is used in some data flow (binding definition) is empty.
Example: I have a SendMail step in my workflow, and the email address is an attribute of the business object. When the previous step completes and the attribute (mail address) is empty, the error occurs.
I believe the mail should simply not be sent to anybody rather than raising an error. I think this problem is a support package error.
    Thanks,
    Kleber

Error when I try to read an R/3 .dat file in a data flow

    Dear all,
The scenario is that I read data from SAP R/3 using an R/3 data flow and put it into an R/3 flat file. Then I try to read that file, along with some other table from R/3, in another R/3 data flow. But on execution it gives an error saying it can't open the file:
    3208     2864     R3C-150607     8/23/2009 11:16:59 AM     |Dataflow DF_DeltaNewInfoRecord_SAP
    3208     2864     R3C-150607     8/23/2009 11:16:59 AM     Execute ABAP program <D:/ABAP_PUR/NewInforecord.aba> error <    Open File Error --  D:\ABAP_PUR/InfoRecord.dat>.
    1936     2556     R3C-150607     8/23/2009 11:16:59 AM     |Dataflow DF_DeltaNewInfoRecord_SAP
    1936     2556     R3C-150607     8/23/2009 11:16:59 AM     Execute ABAP program <D:/ABAP_PUR/NewInforecord.aba> error <    Open File Error --  D:\ABAP_PUR/InfoRecord.dat>.
Can you please help me? Thank you very much in advance.
    Regards
    Smijoe
    Equate Petrochemicals

Place ABAP_PUR/InfoRecord.dat in the SAP working directory. This should help.

  • Error on Data Flow Task MSSQL 2012 Clustered "Description: The version of Lookup is not compatible with this version of the DataFlow. "

We have an SSIS package that runs on clustered MSSQL 2012 Enterprise nodes and is failing. We use a job to execute the package.
    Environmental information:
    Product - Microsoft SQL Server Enterprise: Core-based Licensing (64-bit)
    Operating System - Microsoft Windows NT 6.1 (7601)
Platform - NT x64
    Version - MSSQL Version 11.0.3349.0
The package is set to 32-bit. All permissions are verified. It runs in lower environments with the same MSSQL version. All environments are clustered. In the failing environment, all nodes are at the same service pack. I have not verified whether all nodes in the failing environment have SSIS installed. Data access is installed. We have other, simpler packages that run in this environment, just not this one. Time to ask the community for help!
    Error:
    Source: Data Flow Task - Data Flow Task (SSIS.Pipeline)     Description: The version of Lookup is not compatible with this version of the DataFlow.  End Error  Error:  Code: 0xC0048020    
    Description: Component "Conditional Split, clsid {7F88F654-4E20-4D14-84F4-AF9C925D3087}" could not be created and returned error code 0x80070005 "Access is denied.". Make sure that the component is registered correctly.  End Error 
    Description: The component is missing, not registered, not upgradeable, or missing required interfaces. The contact information for this component is "Conditional Split;Microsoft Corporation; Microsoft SQL Server; (C) Microsoft Corporation; All Rights
    Reserved; http://www.microsoft.com/sql/support;0".  End Error 
(Shop-specific information has been left out. This is the first error in the errors returned by the job history for this package.)
    Thanks in advance.

    Hi DeveloperMax,
According to your description, the error occurs when you execute the package with an Agent job on clustered MSSQL 2012 Enterprise nodes.
As I understand it, this can happen when SQL Server Agent schedules a SQL Server Integration Services package in a 64-bit environment while the package references 32-bit DLLs or drivers that exist only in 32-bit versions, so the job fails.
To fix this, run the package in 32-bit mode from the 64-bit SQL Server Agent: in the Job Step dialog box, select "32 bit runtime" on the Advanced tab.
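For reference, that checkbox simply adds the /X86 switch to the Agent job step's command line for the SSIS subsystem. A minimal T-SQL sketch of scripting such a step (the job, step, and package path names here are hypothetical, and the Agent job itself must already exist):

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Nightly SSIS load',                          -- hypothetical existing Agent job
    @step_name = N'Run package in 32-bit mode',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\Packages\MyPackage.dtsx" /X86';    -- /X86 = the "32 bit runtime" checkbox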
    Besides, we should make sure that SQL Server Integration Services is installed on the failing environment.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Data Flow terminated due to error 120307

    Hi.
I get this error when executing the project.
    Source system: SyBase IQ.
    Target system: SAP HANA.
Some of the tables copied successfully, but the job terminated anyway.
    I attached screenshots with Progress screen and Monitoring.
    Error log is empty.
In the trace log I see the errors from the subject line.
I also have another strange message in the trace log:
    Cache statistics determined that data flow <SYBASE_IQ_2_HOD_DBA_FACT_FINAL> uses 0 caches with a total size of 0 bytes, which is less than (or equal to) 3757047808 bytes available for caches in virtual memory. Data flow will use IN MEMORY cache type.

I executed this job from Data Services Designer and got another error:
main Bufman: An error was detected on a database page. You may have a damaged index. For additional information, please check your IQ message file or run sp_iqcheckdb
Trying to find the issue on Google.

  • Read from sql task and send to data flow task - [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.

I have created an Execute SQL Task.
In it, I created an 'empidvar' variable of string type and set the SQLStatement to 'select distinct empid from emp', with the Result Set tab mapping result name 0 to the variable empidvar.
I then added a data flow task with an OLE DB source and put this SQL statement under SQL command: exec emp_sp @empidvar=?
I am getting an error:
    [OLE DB Source [1]] Error: A rowset based on the SQL command was not returned by the OLE DB provider.
    [SSIS.Pipeline] Error: component "OLE DB Source" (1) failed the pre-execute phase and returned error code 0xC02092B4.

Shouldn't the setting be ResultSet = Full result set, since your query returns a result set? Also, I think the variable to be mapped should be of Object type.
Then, for the data flow task, you need to put it inside a Foreach Loop based on the ADO.NET recordset and map your earlier variable to it, so that it iterates over every value the SQL task returns.
Also, if you are using a stored procedure in the OLE DB source, make sure you read this:
    http://consultingblogs.emc.com/jamiethomson/archive/2006/12/20/SSIS_3A00_-Using-stored-procedures-inside-an-OLE-DB-Source-component.aspx
Please mark this as answer if it helps to solve the issue.
Visakh
http://visakhm.blogspot.com/
https://www.facebook.com/VmBlogs
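As a side note on the original error ("A rowset based on the SQL command was not returned by the OLE DB provider"): if you do keep the stored procedure in the OLE DB source, one commonly suggested fix is to suppress row-count messages and declare the result shape so SSIS can read the metadata. A hedged T-SQL sketch (emp_sp and its result column are assumptions from the question; WITH RESULT SETS requires SQL Server 2012+):

SET NOCOUNT ON;                     -- suppress row-count messages that can hide the rowset
EXEC dbo.emp_sp @empidvar = ?       -- ? is mapped on the Parameters page of the OLE DB source
WITH RESULT SETS ((empid INT));     -- declare the result shape explicitly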

  • "Syntax error or access violation" on Data Flow Task OLE DB Data Source

I am implementing an expression parameter for a SQL Server connection string (like this: http://danajaatcse.wordpress.com/2010/05/20/using-an-xml-configuration-file-and-expressions-in-an-ssis-package/) and it works fine until it reaches the data flow task's OLE DB Source. In this task, I execute a stored procedure like this:
exec SelectFromTableA ?,?,?
    The error message is this:
    0xC0202009 at Data Flow Task, OLE DB Source [2]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft OLE DB Provider for SQL Server"  Hresult: 0x80004005  Description: "Syntax error or access violation".
    Error: 0xC004706B at Data Flow Task, SSIS.Pipeline: "OLE DB Source" failed validation and returned validation status "VS_ISBROKEN"
When I change the SQL command above to read from the table directly, it works fine. I should also add that before I changed the connection string of the SQL data source to use an expression, the SSIS package was working fine, and I know that the connection string is fine because the other tasks in the package work fine!
Any idea why?

    Hi AL.M,
As per my understanding, this problem may be due to a mismatch between the source and the destination tables. We can reconfigure each component of the package to check the table schemas and configuration settings, close BIDS/SSDT, and then reopen it to see whether the errors persist.
Besides, to troubleshoot this issue, we can use the Variables window to check the variables' values. For more details, please refer to the following blog:
    http://consultingblogs.emc.com/jamiethomson/archive/2005/12/05/2462.aspx
    The following blog about “SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred: Reasons and troubleshooting” is for your reference:
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2009/11/10/ssis-error-code-dts-e-oledberror-an-ole-db-error-has-occurred-reasons-and-troubleshooting.aspx
    Hope this helps.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • Data Flow from TXT to a table error

    Hello,
I am trying to load the data from a .txt file I have into a table in a DB. Previously this worked fine in DTS, and I can still do it when I import the DTS command, but I want to upgrade this to a data flow because the DTS commands need to run on 32-bit and I'm using 64-bit.
    I'm getting 3 errors:
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E21  Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
    [OLE DB Destination [322]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (335)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE
    DB Destination Input" (335)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
    [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (322) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (335). The identified
    component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the
    failure.
Before this, I changed the Flat File Source's input and output properties in the advanced editor to text stream [DT_TEXT], because the table has VarChar columns; I had another error as well, but that seems to be resolved. The only remaining problem is that in the mappings the input is text stream [DT_TEXT] but the output is a string, and I am unable to change this in the advanced editor of the OLE DB destination. I can change it, but it changes back on its own.
Could I please get some help with these errors?
    Thanks

    Hi SQLNewbie101,
According to your description, when you change the column data type in the advanced editor of the OLE DB Destination, it always changes back.
Based on my research, the column data type is determined by the destination table; it depends on the columns in the table, so we cannot change it there.
To fix this issue, one way, as you said, is to use a Data Conversion transformation to convert the [DT_TEXT] data type to [DT_STR] after the Flat File Source. Another way is to change the column data type directly on the Advanced tab of the Flat File Connection Manager Editor, and then double-click the Flat File Source to update the columns.
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support
