Data conversion error

Hi All,
I am getting the below error in my SQL Server Agent job:
Source: Data Flow Task Data Conversion [31]  Description: Data conversion failed while converting column "Partner" (935) to column "Copy of Partner" (940). The conversion returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". End Error
Error: 2013-09-25 15:16:31.32  Code: 0xC020902A  Source: Data Flow Task Data Conversion [31]  Description: The "output column "Copy of Partner" (940)" failed because truncation occurred, and the truncation row disposition on "output column "Copy of Partner" (940)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. End Error
Error: 2013-09-25 15:16:31.33  Code: 0xC0047022  Source: Data Flow Task SSIS.Pipeline  Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion" (31) failed with error code 0xC020902A while processing input "Data Conversion Input" (32). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure. End Error
Error: 2013-09-25 15:16:31.35  Code: 0xC02020C4  Source: Data Flow Task Excel Source [1]  Description: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020. End Error
Error: 2013-09-25 15:16:31.36  Code: 0xC0047038  Source: Data Flow Task SSIS.Pipeline  Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Excel Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error
DTExec: The package execution returned DTSER_FAILURE (1). Started: 3:10:23 PM  Finished: 3:16:38 PM  Elapsed: 374.184 seconds. The package execution failed. The step failed.
Kindly help me to debug this.

Possible cause of the error: the source has more characters than the conversion allows.
i.e. the conversion allows DT_STR(50) (50 characters), but a source record has more than 50 characters in that column.
Possible solutions: increase the conversion's character limit if possible. Or, under Error Outputs > Truncation, switch from Fail Component to Redirect Row or Ignore Failure (you will then have to decide how to handle such rows). Or, make sure the source cannot violate this character limit.
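To spot offending rows up front, here is a minimal T-SQL sketch, assuming the Excel data has first been staged into a SQL Server table (dbo.PartnerStaging is a hypothetical name) and the conversion target is DT_STR(50):

-- Hypothetical staging table; adjust the name and the limit to your package.
SELECT Partner, LEN(Partner) AS PartnerLength
FROM dbo.PartnerStaging
WHERE LEN(Partner) > 50        -- rows that DT_STR(50) would truncate
ORDER BY PartnerLength DESC;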

Similar Messages

  • Data Conversion Errors for the last week

    We've been running a simple Stream Analytics job for a little over a month now with a very light workload. Input is an Event Hub and output is SQL Server. We noticed today that we haven't received anything into SQL Server since 2014-12-08 (we don't receive events every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems that the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
    I wonder what could have happened? We haven't touched the client since we started the job, so it's still sending the messages in the same format. And we haven't touched the job's query either.
    Has there been an update to either Stream Analytics or Event Hubs which could cause the issue we're seeing?

    I've followed the TollApp instructions word for word (except for the NamespaceType "Messaging" parameter that has been added to New-AzureSBNamespace).
    I get 0 lines in output, and this is the service log:
    Correlation ID:
    e94f5b9e-d755-4160-b49e-c8225ceced0c
    Error:
    Message:
    After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
    Message Time:
    2015-01-21 10:35:15Z
    Microsoft.Resources/EventNameV2:
    sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation:
    Information
    Microsoft.Resources/ResourceUri:
    /subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
    Type:
    CsvParserError
    Then I stopped the job, and connected to the event hub with a console app and received that:
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
    Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
    Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. At worst it's in the stream itself, but then I should see at least some lines in the output for the correctly formatted messages.
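    For comparison, a minimal sketch of what a well-formed CSV event looks like, assuming the 2015-era Microsoft.ServiceBus.Messaging SDK that the TollApp sample used (the connection string and hub name are placeholders):

    using System.Text;
    using Microsoft.ServiceBus.Messaging;

    class SendCsvEvent
    {
        static void Main()
        {
            var client = EventHubClient.CreateFromConnectionString(
                "<event hub connection string>", "entrystream");

            // ASA's CSV deserializer needs the header row; this event carries
            // the header plus one data row, matching the TollApp feed above.
            string payload =
                "TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag\r\n" +
                "85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677";

            client.Send(new EventData(Encoding.UTF8.GetBytes(payload)));
        }
    }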

  • Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).

    Hi,
    I have a file where fields are wrapped with ".
    =========== file sample
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    ==========
    I have a .NET method that removes the wrap characters and writes out a file without them.
    ======================
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    ======================
    the .net code is here.
    ========================================
    public static string RemoveCharacter(string sFileName, char cRemoveChar)
    {
        // Requires: using System; using System.IO;
        object objLock = new object();
        FileStream objInputFile = null, objOutFile = null;
        // Build the output path once: the original code generated a second,
        // different GUID in the return statement, so it returned a path that
        // was never actually written.
        string sOutFileName = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
        lock (objLock)
        {
            try
            {
                objInputFile = new FileStream(sFileName, FileMode.Open);
                objOutFile = new FileStream(sOutFileName, FileMode.Create);
                int nByteRead;
                // Copy every byte except the character being removed.
                while ((nByteRead = objInputFile.ReadByte()) != -1)
                    if (nByteRead != (int)cRemoveChar)
                        objOutFile.WriteByte((byte)nByteRead);
            }
            finally
            {
                if (objInputFile != null) objInputFile.Close();
                if (objOutFile != null) objOutFile.Close();
            }
        }
        return sOutFileName;
    }
    ==================================
    However, when I run the bulk load utility I get the error:
    =======================================
    Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
    ==========================================
    the bulk insert statement is as follows
    =========================================
     BULK INSERT Temp
     FROM '<file name>' WITH (
       FIELDTERMINATOR = ','
       , KEEPNULLS
     );
    ==========================================
    Does anybody know what is happening and what needs to be done?
    PLEASE HELP
    Thanks in advance 
    Vikram

    To load that file with BULK INSERT, use this format file:
    9.0
    4
    1 SQLCHAR 0 0 "\""      0 ""    ""
    2 SQLCHAR 0 0 "\",\""   1 col1  Latin1_General_CI_AS
    3 SQLCHAR 0 0 "\",\""   2 col2  Latin1_General_CI_AS
    4 SQLCHAR 0 0 "\"\r\n"  3 col3  Latin1_General_CI_AS
    Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
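    With that saved as, say, partner.fmt (a hypothetical name), the load would reference it like this:

    BULK INSERT Temp
    FROM '<file name>'
    WITH (FORMATFILE = 'partner.fmt', KEEPNULLS);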
    Or, since you already have a .NET program, use a stored procedure with a table-valued parameter instead. I have an example of how to do this here:
    http://www.sommarskog.se/arrays-in-sql-2008.html
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Most simple query on Event Hub stream (json) constantly gives Data Conversion Errors

    Hello all,
    Been playing with ASA in December and didn't have any issues; my queries kept working and output the data as needed. However, since January, I created a new demo where I now constantly get data conversion errors. The scenario is described below, but I have the following questions:
    Where can I get detailed information on the data conversion errors? I can't find anything at the moment (not in the operation logs and not in the table storage of my diagnostics storage account).
    What could be wrong in my scenario, and what could be causing these issues?
    The scenario I have implemented is the following:
    My local devices send EventData objects, serialized through Json.Net to an Event Hub with 32 partitions.
    I define my query input as Event Hub Stream and define the data as json/utf8.  I give it the name TelemetryReadings
    Then I write my query as SELECT * FROM TelemetryReadings
    In the output, I create an output on blob with CSV/UTF8 encoding
    After that, I start the job
    The result is an empty blob container (no output written) and tons of data conversion errors in the monitoring graph.  What should I do to get this solved?
    Thanks
    Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu

    So, apparently the issue was related to the incoming objects I had: I was sending unsupported data types (boolean and Dictionary). I changed my code to remove these from the JSON and that worked out well. There was a change that got deployed that threw an exception instead of marking the unsupported fields as null; that's why things worked earlier.
    So, it had to do with the limitation that I mentioned in my earlier comment:
    https://github.com/Azure/azure-content/blob/master/articles/stream-analytics-limitations.md
    Unsupported type conversions result in NULL values: any event values with type conversions not supported in the Data Types section of the Azure Stream Analytics Query Language Reference will result in a NULL value. In this preview release, no error logging is in place for these conversion exceptions.
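    A minimal sketch of the kind of change described above: flattening the unsupported bool and Dictionary members into supported scalar values before serializing with Json.NET (the class and property names are made up for illustration):

    using Newtonsoft.Json;

    class TelemetryReading
    {
        public string DeviceId { get; set; }
        public double Temperature { get; set; }
        public string IsOnline { get; set; }        // was bool; flattened to a string
        public double SensorHumidity { get; set; }  // was Dictionary<string, double>;
        public double SensorPressure { get; set; }  // each entry becomes its own property
    }

    class Program
    {
        static void Main()
        {
            var reading = new TelemetryReading
            {
                DeviceId = "dev-01",
                Temperature = 21.5,
                IsOnline = "true",
                SensorHumidity = 0.47,
                SensorPressure = 1013.2
            };
            // The resulting JSON now contains only types ASA supported at the time.
            string json = JsonConvert.SerializeObject(reading);
        }
    }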
    I am creating a blog post on this one
    Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu

  • Data conversion error. Overflowed the specific type.

    Hi guys, as usual I am struggling with data conversion. I have a column BS as float in SQL; the usual values are 10.5656445, 899.66552366, etc. I am trying to move the data into Excel, but I want only one decimal. I tried decimal in the Data Conversion with precision 1, but it returns the same value. Now I am trying numeric with precision 1, but it returns the error "Conversion failed because the data value overflowed the specified type". Any suggestions? Thanks

    Could you tell me what the result should be for 10.5656445 and 899.66552366? Is it 10.5 and 899.6?
    If yes, then set Precision: 4, Scale: 1.
    (This is a total of 4 digits: 3 digits for the whole number and 1 for the fraction.)
    Cheers,
    Vaibhav Chaudhari
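    A quick T-SQL check of the same conversion. Note that SQL Server rounds rather than truncates when casting, so numeric(4,1) yields 10.6 and 899.7:

    SELECT CAST(10.5656445   AS numeric(4,1)) AS BS1,  -- 10.6
           CAST(899.66552366 AS numeric(4,1)) AS BS2;  -- 899.7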

  • Data Conversion Error Message

    I am getting the following error when running a report in Crystal Reports 2008:
    'Database Connection Error: '42000:[Microsoft][ODBC SQL Server Driver][SQL Server]Error converting data type varchar to datetime. [Database Vendor Code: 8114 ]'
    I can run my stored procedure without error in SQL Server.  When I add the stored procedure to a new, simple Crystal Report, it runs without error and data is returned.  The Crystal Report that is getting the error has been converted from Crystal Reports 8 to Crystal Reports 2008.  There has to be something in the way the report is passing data to the stored procedure that is causing the issue.  Any ideas?

    Hi, 
    After you open the report in CR2008, go to Database | Verify Database.  You should get prompted again for the parameters.  Enter them and try running the report. 
    There is a big difference in how CR 8 and 2008 work with SQL. Verifying the database usually clears these issues up. If it doesn't fix it, try browsing each datetime field to see which field Crystal is seeing as character. 
    Good luck,
    Brian

  • Data Conversion errors in loading

    Hi,
    I am trying to load data into a cube using a generic extraction DataSource based on a table.
    The following errors occurred while loading:
    'Content KG from field ZREFQTY not convertible in type QUAN --> see long text'
    'Content INR from field ZREFVAL not convertible in type CURR --> see long text'
    but the data arrived with errors in the PSA.
    The errors occurred in both of the reference fields, i.e. 0UNIT and 0CURRENCY.
    thanks
    venkat

    Venkat,
    First, make sure you're not mapping 0UNIT to ZREFQTY and 0CURRENCY to ZREFVAL... I know it's silly, but I've seen it before...
    Check the definition of key figure ZREFQTY and make sure you have 0UNIT set as its "Unit/Currency"... Same for ZREFVAL: make sure it has 0CURRENCY as its "Unit/Currency".
    The last check would be to make sure that the fields you're mapping from the DataSource to ZREFQTY and ZREFVAL have the same data types in their definitions (just scroll to the right in the Transfer Rules and you'll see their Type and Length).
    Regards,
    Luis

  • Date conversion error

    I'm trying to calculate the difference between two dates and display it on the report. I'm pulling my date from the Audit_log_dim.AU_time table, and in that table the format of the date is 1/14/2009. I am using the current date function to pull the current date, and that format is the same. I created a formula that subtracts: MAX(Audit_log_dim.AU_time) - CURRENT_DATE. This formula is giving me an error; please see the error below. Can somebody tell me what I am doing wrong?
    (Screenshot of the error: http://i49.tinypic.com/22bzwz.jpg)

    Vishwam wrote:
    Hi user1671409,
    If I understood correctly, at the dashboard level '@{AsOnDate}' will be returning something other than the 'RRRR-MM-DD' format.
    Can you please post the error message?
    Regards,
    Vishwam

    Hi Vishwam,
    Please find the error I'm getting in the dashboard:
    Error codes: OPR4ONWY U9IM8TAC OI2DL65P
    State: HY000 code: [nQS Error: 10058] A general error has occurred. [nQS Error: 17001] Oracle error code: 1830, message: ORA-01830: date format picture ends before converting entire input string at OCI call OCIStmtExecute.
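    For reference, ORA-01830 means the input string contains more text than the format mask accounts for; a minimal Oracle sketch with made-up values:

    -- Fails with ORA-01830: the literal carries a time portion the mask doesn't cover.
    select to_date('2009-01-14 00:00:00', 'YYYY-MM-DD') from dual;

    -- Works: the mask matches the entire input string.
    select to_date('2009-01-14 00:00:00', 'YYYY-MM-DD HH24:MI:SS') from dual;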

  • Handle Date conversion error without function

    Hi,
    Database version is 12.1, though I would also like to hear whether a solution is possible for a 9i DB too, as I am pulling data from a view over a DB link, and would prefer to have the date fix within the view rather than having to recreate the view logic in my data warehouse.
    A small illustrative sample of what I want.
    Sample Data
    create table descriptors (attribute1 varchar2(25), creation_date date);
    insert into descriptors (attribute1, creation_date) values ('010612', sysdate);
    insert into descriptors (attribute1, creation_date) values ('310612', sysdate);
    commit;
    And the result I am after, in pseudo code:
    select case when to_date(attribute1,'DDMMYY') = error then creation_date else to_date(attribute1,'DDMMYY') end as safe_date from descriptors
    Yes - I know this will not work!!
    And yes, I know storing dates as text is not best practice; this is drill-down data from Oracle E-Business feeder files, decisions that were made before I had any influence over them, so I am having to make the best of a bad job.
    thanks for your input,
    Robert.

    Etbin wrote:
    select case when case when to_number(substr(attribute1,3,2)) not between 1 and 12
    then 'error'
    when to_number(substr(attribute1,1,2)) not between 1 and to_number(to_char(last_day(to_date(substr(attribute1,3),'mmrr')),'dd'))
    Neat solution!
    I'll remember that.
    But I fail to understand why the two embedded cases?
    Should this be enough?
    with
    descriptors(attribute1,creation_date) as
    (select '010612',trunc(sysdate) from dual union all
     select '110612',trunc(sysdate) from dual union all
     select '210612',trunc(sysdate) from dual union all
     select '310612',trunc(sysdate) from dual
    )
    select case when to_number(substr(attribute1,3,2)) not between 1 and 12
                then creation_date
                when to_number(substr(attribute1,1,2)) not between 1 and to_number(to_char(last_day(to_date(substr(attribute1,3),'mmrr')),'dd'))
                then creation_date
                else to_date(attribute1,'ddmmrr')
           end checked
      from descriptors
    /
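    As a side note outside the poster's 12.1/9i constraint: Oracle 12.2 later added an ON CONVERSION ERROR clause that collapses these checks into one expression, along the lines of:

    -- Requires Oracle 12.2 or later; not available on 12.1 or 9i.
    select coalesce(to_date(attribute1 default null on conversion error, 'ddmmrr'),
                    creation_date) as safe_date
      from descriptors;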

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run the stored procedure without error in SQL Server using my personal credentials as well as the database credentials.
    I can also run the Crystal Report after connecting to the stored procedure without error on my desktop, using my personal credentials as well as the database credentials.
    But when I upload the Crystal Report to BOBJDEV and run it using the database credentials, the report fails saying "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]". However, I can run the Crystal Report successfully on BOBJDEV using my personal credentials.
    I googled this issue (Data Conversion Error Message) and a lot of people suggested doing "Verify Database" in Crystal Reports. So I did that, but when I do it I get an error message like this:
    Error converting data type varchar to datetime.
    Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how do I resolve it?
    (FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.

    hello Naveen,
    since the report works fine in the cr designer / desktop, we need to figure out where you should post this question.
    by bobjdev do you mean businessobjects enterprise or crystal reports server? if so please post this question to the bi platform space.
    -jamie

  • How to catch failed rows from excel export data conversion

    I am pulling data from SQL Server and exporting to an Excel file, using SSIS 2008 and sending to Excel 2003. The process is working fine, and I want to grab any data conversion failures; specifically I want to grab any data that fails or is about to be truncated.
    I added a flat file destination to the data conversion error line (red) and pointed it to a txt file. This caused an error saying some of the columns were the wrong data type to go in a text file. So I added a data conversion to the first data conversion error line, but the data types won't change.
    The weird thing is, the error says the columns are DT_NTEXT and need to be DT_TEXT, but they aren't, they are DT_WSTR. Anyway, I tried to convert to DT_TEXT and it caused the data conversions in my original conversion to change, which broke the whole package.
    My intention is to grab the erroring row so it can be manually converted. So how do I do that without adding 100 more errors?

    Hi teahou,
    do you really use two accounts to post on the MSDN forums?
    I think the data types were not guessed correctly by the Flat File Destination component, so you need to adjust them using the advanced editor; the data conversion transformations then naturally become obsolete.
    Arthur
    MyBlog
    Twitter

  • Err.Description="Data access error."

    This question pertains to FDM 9.3.1, ImportAction script, StartProcess event.
    After successfully CREATE-ing a table and BULK INSERT-ing a file (sans a few erroneous records) using "DW.DataManipulation.fExecuteDML strSQL, True, True" or "DW.DataManipulation.fExecuteDML strSQL, False, True", I get Err.Number=2147217900 and Err.Description="Data access error."
    However, after explicitly defining a connection to the FDM database server and executing the same SQL against the same data, I get an error description that is much more useful for my users: Err.Number=-2147217900 and Err.Description="Bulk load data conversion error (truncation) for row x, column y (fieldname)."
    I'd rather not establish a second connection to the FDM database server if I can avoid it. Any ideas as to why each method produces a different error message? How do I get fExecuteDML to return the useful description?
    ' Connect to the FDM database server
    Set adoConn = CreateObject("ADODB.Connection")
    adoConn.ConnectionString = API.DataWindow.Connection.PstrConnection
    adoConn.CursorLocation = 3 ' adUseClient
    adoConn.Open
    ' Create the ADODB Command object
    Set adoComm = CreateObject("ADODB.Command")
    adoComm.ActiveConnection = adoConn
    adoComm.CommandType = 1 ' adCmdText (1), since strSQL is a SQL string; adCmdTable would be 2
    On Error Resume Next
    adoComm.CommandText = strSQL
    adoComm.Execute
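    If it helps, a small sketch of pulling the detailed provider message out of the ADO Errors collection after the Execute (variable names are illustrative):

    If Err.Number <> 0 Then
        Dim adoErr, sDetail
        For Each adoErr In adoConn.Errors
            ' Each ADO Error usually carries the detailed provider text, e.g.
            ' "Bulk load data conversion error (truncation) for row x, column y".
            sDetail = sDetail & adoErr.Number & ": " & adoErr.Description & vbCrLf
        Next
    End If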

    Larry - apologies for the incomplete response. I misunderstood your question as "how do I change the error message that FDM displays" and assumed you were trying to modify the information bar display.
    Glad you found something that helps you.

  • How do I catch failed data conversion rows?

    I am pulling data from SQL Server and exporting to an Excel file, using SSIS 2008 and sending to Excel 2003. The process is working fine, and I want to grab any data conversion failures; specifically I want to grab any data that fails or is about to be truncated.
    I added a flat file destination to the data conversion error line (red) and pointed it to a txt file. This caused an error saying some of the columns were the wrong data type to go in a text file. So I added a data conversion to the first data conversion error line, but the data types won't change.
    The weird thing is, the error says the columns are DT_NTEXT and need to be DT_TEXT, but they aren't, they are DT_WSTR. Anyway, I tried to convert to DT_TEXT and it caused the data conversions in my original conversion to change, which broke the whole package.
    My intention is to grab the erroring row so it can be manually converted. So how do I do that without adding 100 more errors?
    Simon.

    Simon,
    There is a reasonable suspicion that the post
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/46579050-827b-4219-8e5a-fb00bfd19902/how-to-catch-failed-rows-from-excel-export-data-conversion?forum=sqlintegrationservices
    is also from you.
    So see the reply there anyway.
    Arthur
    MyBlog
    Twitter

  • Unit Conversion Error in Direct Input method for data transfer

    Hi Experts,
    I am getting an error "E MG 427: Conversion error: field BMMH6-MEINH; content PAK" when uploading an alternative UoM using the BMMH6 structure in Direct Input. I checked the value for the UoM in the converted data and it is "PAC". I think the system is internally converting it to PAK, so the error is coming up.
    Please let me know what needs to be done to avoid this error.
    Thanks in Advance..
    -Harkamal

    Hi,
    Before passing this unit to the program, check the conversion exit in the domain of the field and use the function module CONVERSION_EXIT_ALPHA_INPUT: pass that value and see how it takes it.
    Otherwise, use the function module UNIT_CONVERSION_SIMPLE and pass the value to the program.
    Regards
    Anji

  • Importing data error: SIGNEDDATA conversion error

    The complete description is:
    The number of failing rows exceeds the maximum specified. (Microsoft Data Transformation Services (DTS) Data Pump (8004202c): TransformCopy 'DTSTransformation__13' conversion error:  General conversion failure on column pair 1 (source column 'SIGNEDDATA' (DBTYPE_STR), destination column 'SIGNEDDATA' (DBTYPE_NUMERIC)).)
    We are trying to convert a value that comes with thousands separators; it uses the character "." (point) as the thousands separator and "," (comma) as the decimal separator. We use a VB function (Replace) to change the points to an empty string, but it doesn't work. Do you have any suggestions to solve this?
    We are using BPC 7.0 MS SP3.
    Thanks in advance,
    Mariana Rodriguez.
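    A minimal sketch of the Replace-based cleanup described above; order matters (strip the "." thousands separators first, then swap the "," decimal for "."), and the function name is made up:

    Function NormalizeAmount(sValue)
        ' "1.234,56" -> "1234,56" -> "1234.56"
        sValue = Replace(sValue, ".", "")
        NormalizeAmount = Replace(sValue, ",", ".")
    End Function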

    Hi Mariana,
    I've had the same kind of problem, which gave me the
    General conversion failure on column pair 1 (source column 'SIGNEDDATA' (DBTYPE_STR), destination column 'SIGNEDDATA' (DBTYPE_NUMERIC)).
    error message.
    My issue was resolved by changing the CREDITNEGATIVE=NO option to CREDITNEGATIVE=YES, because BPC probably converted negative values as 'string' instead of 'numeric' when importing SAP FI data with the minus sign '-' at the end (100-) instead of SQL SIGNEDDATA (-100).
    But I have to be careful; my Account dimension setup is as follows:
    Balance Sheet Assets => AccType=AST
    Balance Sheet Liabilities => AccType=LEQ
    Profit and Loss Revenues => AccType=INC
    Profit and Loss Expenses => AccType=INC
    My flat file separator is a comma (,) and the decimal point is a point (.).
    We are using BPC 7.5 MS SP07 Patch 02 and I'm not using any VBA to convert the amount.
    Maybe it can help?
    Regards,
    Peter van Drunen - FPM Solutions
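    For reference, a sketch of where this option sits in the *OPTIONS section of a BPC (MS) transformation file; the surrounding option values are assumptions for a comma-delimited source:

    *OPTIONS
    FORMAT = DELIMITED
    DELIMITER = ,
    CREDITNEGATIVE = YES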
