Data Conversion errors in loading

Hi,
I am trying to load data into a cube using a generic extraction DataSource based on a table.
The following errors occurred while loading:
'Content KG from field ZREFQTY not convertible in type QUAN-->see long text'
'Content INR from field ZREFVAL not convertible in type CURR-->see long text'
The data arrived in the PSA, but with errors.
The errors occur in both reference fields, i.e. 0UNIT and 0CURRENCY.
thanks
venkat

Venkat,
First, make sure you're not mapping 0UNIT to ZREFQTY and 0CURRENCY to ZREFVAL... I know it sounds silly, but I've seen it before...
Check the definition of Key Figure ZREFQTY and make sure you have "0UNIT" set as "Unit/Currency"... Same for ZREFVAL, make sure it has "0CURRENCY" as "Unit/Currency".
The last check would be to make sure that the fields you're mapping from the DataSource to ZREFQTY and ZREFVAL have the same data types in their definitions (just scroll to the right in the transfer rules and you'll see their Type and Length).
Regards,
Luis

Similar Messages

  • Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).

    Hi,
    I have a file where fields are wrapped with double quotes (").
    =========== file sample
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    "asdsa","asdsadasdas","1123"
    ==========
    I have a .NET method that removes the wrap characters and writes out a file without them. The output looks like this:
    ======================
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    asdsa,asdsadasdas,1123
    ======================
    The .NET code is here:
    ========================================
    public static string RemoveCharacter(string sFileName, char cRemoveChar)
    {
        object objLock = new object();
        FileStream objInputFile = null, objOutFile = null;
        // Build the output path once, so the file that is written is the same one whose name is returned.
        string sOutFileName = sFileName.Substring(0, sFileName.LastIndexOf('\\')) + "\\" + Guid.NewGuid().ToString();
        // Note: objLock is a local object, so this lock does not synchronize concurrent calls.
        lock (objLock)
        {
            try
            {
                objInputFile = new FileStream(sFileName, FileMode.Open);
                objOutFile = new FileStream(sOutFileName, FileMode.Create);
                int nByteRead;
                // Copy byte by byte, skipping the character to be removed.
                while ((nByteRead = objInputFile.ReadByte()) != -1)
                {
                    if (nByteRead != (int)cRemoveChar)
                        objOutFile.WriteByte((byte)nByteRead);
                }
            }
            finally
            {
                if (objInputFile != null) objInputFile.Close();
                if (objOutFile != null) objOutFile.Close();
            }
        }
        return sOutFileName;
    }
    ==================================
    However, when I run the bulk load utility I get this error:
    =======================================
    Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 3 (NumberOfMultipleMatches).
    ==========================================
    The BULK INSERT statement is as follows:
    =========================================
     BULK INSERT Temp
     FROM '<file name>'
     WITH (FIELDTERMINATOR = ',', KEEPNULLS)
    Does anybody know what is happening and what needs to be done?
    PLEASE HELP
    Thanks in advance 
    Vikram

    To load that file with BULK INSERT, use this format file:
    9.0
    4
    1 SQLCHAR 0 0 "\""      0 ""    ""
    2 SQLCHAR 0 0 "\",\""   1 col1  Latin1_General_CI_AS
    3 SQLCHAR 0 0 "\",\""   2 col2  Latin1_General_CI_AS
    4 SQLCHAR 0 0 "\"\r\n"  3 col3  Latin1_General_CI_AS
    Note that the format file defines four fields while the file only seems to have three. The format file defines an empty field before the first quote.
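    Assuming the format file above is saved to disk (both paths below are placeholders), the load can then be pointed at it along these lines:
        BULK INSERT Temp
        FROM 'C:\data\input.csv'
        WITH (FORMATFILE = 'C:\data\input.fmt', KEEPNULLS);
    With the format file in place there is no need to strip the quotes in .NET first, since the quote characters are consumed as part of the field terminators.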
    Or, since you already have a .NET program, use a stored procedure with a table-valued parameter instead. I have an example of how to do this here:
    http://www.sommarskog.se/arrays-in-sql-2008.html
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Getting Duplicate data Records error while loading the Master data.

    Hi All,
    We are getting a duplicate data records error while loading the profit centre master data. The master data contains time-dependent attributes.
    The load is a direct update, so I set the request to red and tried to reload from the PSA, but it throws the same error.
    I checked the PSA; it shows in red which records have the same profit centre.
    Could anyone give us any suggestions to resolve this issue, please?
    Thanks & Regards,
    Raju

    Hi Raju,
    I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from any ODS). If that is the case, then it could be that the data maintained in R/3 has overlapping time intervals (since time dependency of attributes is involved). Check your PSA to see if the same profit centre has time intervals which overlap. In that case, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error tolerance limit in your InfoPackage and repeat the load.
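    The actual check is normally done in the PSA maintenance screen, but purely as an illustration, an overlap check in generic SQL would look something like this (table and column names are hypothetical):
        -- Pairs of records for the same profit centre whose validity intervals overlap
        SELECT a.profit_ctr, a.date_from, a.date_to, b.date_from, b.date_to
        FROM   psa_profit_ctr a
        JOIN   psa_profit_ctr b
          ON   a.profit_ctr = b.profit_ctr
         AND   a.date_from  < b.date_from      -- compare each pair only once
         AND   a.date_to   >= b.date_from      -- intervals overlap
        ORDER BY a.profit_ctr;
    Any rows returned point to overlapping intervals that would have to be corrected in R/3 before reloading.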
    Hope this helps you.
    Thanks & Regards,
    Nithin Reddy.

  • Data Conversion Errors for the last week

    We've been running a simple Stream Analytics job for a little over a month now with a very light workload. Input is an Event Hub and output is SQL Server. We noticed today that we haven't received anything into SQL Server since 2014-12-08 (we don't receive events every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
    I wonder what could have happened? We haven't touched the client since we started the job, so it's still sending the messages in the same format. And we haven't touched the job's query either.
    Has there been an update to either Stream Analytics or Event Hubs that could cause the issue we're seeing?

    I've followed the TollApp instructions word for word (except for the NamespaceType "Messaging" that has been added to New-AzureSBNamespace).
    I have 0 lines in the output, and this is the service log:
    Correlation ID: e94f5b9e-d755-4160-b49e-c8225ceced0c
    Error:
    Message: After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
    Message Time: 2015-01-21 10:35:15Z
    Microsoft.Resources/EventNameV2: sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation: Information
    Microsoft.Resources/ResourceUri: /subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
    Type: CsvParserError
    Then I stopped the job, and connected to the event hub with a console app and received that:
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
    Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
    Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. And at worst it's in the stream, but then I should see at least some lines in the output for the correctly formatted messages.

  • Most simple query on Event Hub stream (json) constantly gives Data Conversion Errors

    Hello all,
    Been playing with ASA in December and didn't have any issues; my queries kept working and output the data as needed. However, since January, I created a new demo where I now constantly get data conversion errors. The scenario is described below, but I have the following questions:
    Where can I get detailed information on the data conversion errors? I can't find any at the moment (not in the operation logs and not in the table storage of my diagnostic storage account).
    What could be wrong in my scenario that could be causing these issues?
    The scenario I have implemented is the following:
    My local devices send EventData objects, serialized through Json.Net to an Event Hub with 32 partitions.
    I define my query input as Event Hub Stream and define the data as json/utf8.  I give it the name TelemetryReadings
    Then I write my query as SELECT * FROM TelemetryReadings
    In the output, I create an output on blob with CSV/UTF8 encoding
    After that, I start the job
    The result is an empty blob container (no output written) and tons of data conversion errors in the monitoring graph.  What should I do to get this solved?
    Thanks
    Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu

    So, apparently the issue was related to the incoming objects I had. I was sending unsupported data types (boolean and Dictionary). I changed my code to remove these from the JSON and that worked out well. There was a change that got deployed so that, instead of marking the unsupported fields as null, they were throwing an exception. That's why things worked earlier.
    So, it had to do with the limitation that I mentioned in my earlier comment:
    https://github.com/Azure/azure-content/blob/master/articles/stream-analytics-limitations.md
    Unsupported type conversions result in NULL values: any event values with type conversions not supported in the Data Types section of the Azure Stream Analytics Query Language Reference will result in a NULL value. In this preview release no error logging is in place for these conversion exceptions.
    I am creating a blog post on this one
    Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu

  • Data conversion error

    Hi All,
    I am getting below error in my SQL server agent job.
     Source: Data Flow Task Data Conversion [31]     Description: Data conversion failed while converting column "Partner" (935) to column "Copy of Partner" (940).  The conversion returned status value 4 and status text "Text
    was truncated or one or more characters had no match in the target code page.".  End Error  Error: 2013-09-25 15:16:31.32     Code: 0xC020902A     Source: Data Flow Task Data Conversion [31]     Description: The
    "output column "Copy of Partner" (940)" failed because truncation occurred, and the truncation row disposition on "output column "Copy of Partner" (940)" specifies failure on truncation. A truncation error occurred on
    the specified object of the specified component.  End Error  Error: 2013-09-25 15:16:31.33     Code: 0xC0047022     Source: Data Flow Task SSIS.Pipeline     Description: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The
    ProcessInput method on component "Data Conversion" (31) failed with error code 0xC020902A while processing input "Data Conversion Input" (32). The identified component returned an error from the ProcessInput method. The error is specific
    to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.  End Error  Error: 2013-09-25 15:16:31.35     Code:
    0xC02020C4     Source: Data Flow Task Excel Source [1]     Description: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.  End Error  Error: 2013-09-25 15:16:31.36     Code: 0xC0047038
        Source: Data Flow Task SSIS.Pipeline     Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Excel Source" (1) returned error code 0xC02020C4.  The component returned a failure
    code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the
    failure.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  3:10:23 PM  Finished: 3:16:38 PM  Elapsed:  374.184 seconds.  The package execution failed.  The step failed.
    Kindly help me to debug this

    Possible cause of error: the source has more characters than what is allowed by the conversion.
    i.e. the conversion allows DT_STR(50) (50 characters) but a source record has more than 50 characters for that column.
    Possible solution: increase the conversion character limit if possible. Or, under Error Outputs > Truncation, switch from Fail Component to Redirect Row or Ignore Failure (you will then have to decide what you wish to do with such rows). Or, make sure the source cannot violate this character limit.
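    If the source data is (or can be) staged into a SQL table, a quick length check will show the offending values; the table and column names here are only illustrative:
        -- Rows whose Partner value would not fit into DT_STR(50)
        SELECT Partner, LEN(Partner) AS PartnerLength
        FROM   dbo.StagingTable
        WHERE  LEN(Partner) > 50;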

  • Data conversion error. Overflowed the specific type.

    Hi guys, as usual I am struggling with data conversion. I have a column BS as float in SQL; typical values are 10.5656445, 899.66552366, etc. I am trying to move the data to Excel, but I want only one decimal. I tried decimal in the Data Conversion with precision 1, but it returned the same value; now I am trying numeric with precision 1, but it returns the error "Conversion failed because the data value overflowed the specified type". Any suggestions? Thanks

    Could you tell me what the result should be for 10.5656445 and 899.66552366? Is that 10.5 and 899.6?
    If yes, then set Precision: 4, Scale: 1.
    (This is a total of 4 digits: 3 for the whole number and 1 for the fractional part.)
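    As a quick sanity check of that precision and scale, a T-SQL sketch (note that CAST rounds; ROUND with a non-zero third argument truncates instead):
        SELECT CAST(10.5656445   AS NUMERIC(4,1)),   -- 10.6
               CAST(899.66552366 AS NUMERIC(4,1)),   -- 899.7
               ROUND(899.66552366, 1, 1);            -- 899.6... (truncated to one decimal)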
    Cheers,
    Vaibhav Chaudhari

  • Date format error while loading a *.dat file to staging

    Hi All ,
    While loading data from a *.dat file into staging: the .dat file contains the date in 'MM-DD-YYYY' format as a string, and when it is loaded via the control file it produces the error that the date format is not supported. I think it requires the DD-MM-YYYY format. Is there any property to change the date format, or do I need to write a function to convert it?
    I am using OWB 10g Release 1, with the same DB.
    Please guide me
    Thanx in advance
    Regards ,

    Hi,
    Before loading into staging, you can use an Expression operator to convert the CHAR into DATE with something like this:
    TO_DATE('01-02-2005','MM-DD-YYYY')
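    A quick way to verify the format mask before wiring it into the mapping is to test it against a sample value (the literal below is just an example):
        SELECT TO_DATE('01-14-2009', 'MM-DD-YYYY') FROM dual;
    In the Expression operator the literal is simply replaced by the input attribute that carries the character date.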
    Hope this helps.

  • Date format error in loading Programme

    Hi all,
    I am getting the error I_BLDAT invalid date (ORA-01840: input value not long enough for date format), but somehow in the loading program I am getting the date as '. .0' instead of 00000000. Can someone please help me find where it went wrong? These are the lines involving BLDAT:
    CONCATENATE:
             wa_zfrcptp-bldat+6(2) wa_zfrcptp-bldat+4(2)
             wa_zfrcptp-bldat+0(4) INTO l_lla_dtl_rec-pay_date,
    wa_zfrcptp-bldat        TO l_huon_crr_rec-receipt_date.
    wa_zfrcptp-bldat        TO l_huon_ipr_rec-receipt_date.
    CONCATENATE wa_zfrcptp-bldat+6(2)
                    wa_zfrcptp-bldat+4(2)
                    wa_zfrcptp-bldat+0(4)
                               INTO l_cal_dtl_rec-receipt_date
    bldat_i(8)           TYPE c,
    l_cid_dtl_rec-bldat_i     = wa_zfrcptp-bldat.
    Thanks in advance,
    Rishik.
    Edited by: Rishik on Apr 16, 2009 8:44 AM

    Hi,
    If you are using a BDC to load the data, then while passing BLDAT to the BDC use the WRITE statement to convert the date to the user's date format and pass that to the BDC.
    Here l_endda1 is of type char with length 10, and endda is of type sy-datum.
    WRITE i_input_line-endda TO l_endda1.
        PERFORM bdc_field       USING 'RP50G-ENDDA'    l_endda1.

  • UoM conversion error while loading Material Master

    Hi,
    Our Legacy system is also SAP (lower version) and we're trying to move the data to ECC6.0. We're using MATMAS_MASS_BAPI03 to move Material master records.
    All the alternative UoMs are to be moved with 'no change'. Everything gets moved except for LB and FT3, and we get the message 'UoM conversions are inconsistent'.
    I entered LB, for example, online in the new system and found that the conversion factors (numerator and denominator) were indeed different.
    Legacy LB:  Numerator - 4756 and Denominator - 65113
    New System LB: Numerator - 2564 and Denominator - 35103
    As you may have noticed, we get the same result if you divide the numerator by the denominator.
    Note: KG values were the same in both systems without any issue.
    What should we do to get it right?
    Best Regards,
    Ram

    Hi,
    While loading attributes, you have to select the transfer structure of 0MATERIAL_ATTR and then activate it. Proceed if it is already replicated into the BW system; otherwise do that first, then go ahead. The same applies to texts.
    Regards
    Siddhu

  • Data Conversion Error Message

    I am getting the following error when running a report in Crystal Reports 2008:
    'Database Connection Error: '42000:[Microsoft][ODBC SQL Server Driver][SQL Server]Error converting data type varchar to datetime. [Database Vendor Code: 8114 ]'
    I can run my stored procedure without error in SQL Server.  When I add the stored procedure to a new, simple Crystal Report, it runs without error and data is returned.  The Crystal Report that is getting the error has been converted from Crystal Reports 8 to Crystal Reports 2008.  There has to be something in the way the report is passing data to the stored procedure that is causing the issue.  Any ideas?

    Hi, 
    After you open the report in CR2008, go to Database | Verify Database.  You should get prompted again for the parameters.  Enter them and try running the report. 
    There is a big difference in how CR8 and 2008 work with SQL.  Verifying the database usually clears these issues up.  If it doesn't fix it, try browsing each datetime field to see which field Crystal is seeing as character. 
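    For reference, this error usually means a parameter value the report passes cannot be implicitly converted, which is easy to reproduce in T-SQL; the procedure below is purely illustrative:
        CREATE PROCEDURE dbo.DemoProc @StartDate datetime AS
            SELECT @StartDate;
        GO
        -- Raises: Error converting data type varchar to datetime.
        EXEC dbo.DemoProc @StartDate = 'not a date';
    So checking which datetime field or parameter Crystal is passing as a character string, as suggested above, is the right place to look.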
    Good luck,
    Brian

  • Date conversion error

    I'm trying to calculate the difference between two dates and display it on the report. I'm pulling my date from the Audit_log_dim.AU_time column, and in that table the date format is 1/14/2009. I am using the current date function to pull the current date, and its format is the same. I created a formula that subtracts: MAX(Audit_log_dim.AU_time) - CURRENT_DATE. This formula is giving me an error; please see the error below. Can somebody tell me what I am doing wrong?
    (Error screenshot: http://i49.tinypic.com/22bzwz.jpg)

    Vishwam wrote:
    Hi user1671409,
    If I understood correctly, at the dashboard level '@{AsOnDate}' will be returning something other than the 'RRRR-MM-DD' format.
    Can you please post the error message?
    Regards,
    Vishwam
    Hi Vishwam,
    Please find the error I'm getting in the dashboard:
    Error codes: OPR4ONWY U9IM8TAC OI2DL65P
    State: HY000 Code: [nQS Error: 10058] A general error has occurred. [nQS Error: 17001] Oracle error code: 1830, message: ORA-01830: date format picture ends before converting entire input string at OCI call OCIStmtExecute.
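    For what it's worth, ORA-01830 is raised when the input string contains more than the date format mask covers, which is easy to reproduce (the values below are only illustrative):
        -- Fails with ORA-01830: the literal carries a time part the mask does not cover
        SELECT TO_DATE('1/14/2009 10:35:15', 'MM/DD/YYYY') FROM dual;
        -- Works: the mask covers the whole input
        SELECT TO_DATE('1/14/2009 10:35:15', 'MM/DD/YYYY HH24:MI:SS') FROM dual;
    So the '@{AsOnDate}' value reaching the query most likely contains more (for example a time portion) than the format mask expects.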

  • Handle Date conversion error without function

    Hi,
    Database version is 12.1, though I would also like to hear if the solution is possible for a 9i DB too, as I am pulling data from a view over a DB link and would prefer to have the date fix within the view rather than having to recreate the view logic in my data warehouse.
    A small illustrative sample of what I want.
    Sample Data
    create table descriptors (attribute1 varchar2(25), creation_date date);
    insert into descriptors (attribute1, creation_date) values ('010612',sysdate);
    insert into descriptors (attribute1, creation_date) values ('310612',sysdate);
    commit;
    And the result I am after, in pseudo code:
    select case when to_date(attribute1,'DDMMYY') = error then creation_date else to_date(attribute1,'DDMMYY') end as safe_date from descriptors
    Yes - I know this will not work!!
    And yes, I know storing dates as text is not best practise, this is drill down data from Oracle e-business feeder files, decisions that were made before I had any influence over them - so I am having to make the best of a bad job.
    thanks for your input,
    Robert.

    Etbin wrote:
    select case when case when to_number(substr(attribute1,3,2)) not between 1 and 12
    then 'error'
    when to_number(substr(attribute1,1,2)) not between 1 and to_number(to_char(last_day(to_date(substr(attribute1,3),'mmrr')),'dd'))
    Neat solution !
    I'll remember that.
    But I fail to understand why the two embedded CASEs?
    Should this be enough?
    with
    descriptors(attribute1,creation_date) as
    (select '010612',trunc(sysdate) from dual union all
    select '110612',trunc(sysdate) from dual union all
    select '210612',trunc(sysdate) from dual union all
    select '310612',trunc(sysdate) from dual)
    select case when to_number(substr(attribute1,3,2)) not between 1 and 12
                          then creation_date
                          when to_number(substr(attribute1,1,2)) not between 1 and to_number(to_char(last_day(to_date(substr(attribute1,3),'mmrr')),'dd'))
                          then creation_date
                else to_date(attribute1,'ddmmrr')
           end checked
      from descriptors
    /

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run the stored procedure without error in SQL Server using my personal credentials as well as the database credentials.
    I can also run the Crystal Report after connecting to the stored procedure without error on my desktop, using my personal credentials as well as the database credentials.
    But when I upload the Crystal Report to BOBJDEV and run it using the database credentials, the report fails saying "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]", yet I can run the report successfully on BOBJDEV using my personal credentials.
    I googled this issue (Data Conversion Error Message) and a lot of people suggested doing "Verify Database" in Crystal Reports. I did that, but when I do it I get an error message like this:
    Error converting data type varchar to datetime.
    Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how do I resolve it?
    (FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.

    Hello Naveen,
    Since the report works fine in the CR designer on the desktop, we need to figure out where you should post this question.
    By BOBJDEV do you mean BusinessObjects Enterprise or Crystal Reports Server? If so, please post this question to the BI Platform space.
    -Jamie

  • Err.Description="Data access error."

    This question pertains to FDM 9.3.1, ImportAction script, StartProcess event.
    After successfully CREATEing a table and BULK INSERTING a file sans a few erroneous records using "DW.DataManipulation.fExecuteDML strSQL, True, True" or "DW.DataManipulation.fExecuteDML strSQL, False, True", I get
    Err.Number=2147217900 and Err.Description="Data access error."
    However, after explicitly defining a connection to the FDM database server and executing the same SQL against the same data, I get an error description that is much more useful for my users, Error.Number=-2147217900 and Err.Description="Bulk load data conversion error (truncation) for row x, column y (fieldname)."
    I'd rather not establish a second connection to the FDM database server if I can avoid it. Any ideas as to why each method produces a different error message? How do I get fExecuteDML to return the useful description?
    ' Connect to the FDM database server
    Set adoConn = CreateObject("ADODB.Connection")
    adoConn.ConnectionString = API.DataWindow.Connection.PstrConnection
    adoConn.CursorLocation = 3  ' adUseClient
    adoConn.Open
    ' Create the ADODB Command object
    Set adoComm = CreateObject("ADODB.Command")
    adoComm.ActiveConnection = adoConn
    adoComm.CommandType = 1  ' adCmdTable
    On Error Resume Next
    adoComm.CommandText = strSQL
    adoComm.Execute

    Larry - apologies for the incomplete response. I misunderstood your question to be "how do I change the error message that FDM displays?"; I assumed you were trying to modify the information bar display.
    Glad you found something that helps you.
