Data Conversion issue

Hello, I'm new to Oracle DB and still learning SQL.
I have a problem with data conversion in a CASE expression.
Example:
select ... ,
CASE column WHEN 1 THEN
    ( select max(column)
      from table
      where ... )
ELSE 0 END AS value,
from ...
where ...
order by ...
What I basically intend to do is select a value from another table depending on the value of
a particular column.
The problem now is that the
select max(column)
from table
where ...
statement in its original form returns the right value as a float (e.g. 0.24), which is what I need.
But the same statement embedded in the CASE expression returns 0, which looks like the same result converted to an integer or something.
I tried converting the result inside the CASE expression, but without success.
Does anybody know what could cause this behavior and how to fix it?
Thanks for any help with this.

Welcome to the forum!
Whenever you have a problem, post a complete script that people can run to re-create the problem. Include CREATE TABLE and INSERT statements for a little sample data if necessary. (A lot of folks show the problem using commonly available tables, such as those in the scott or hr schemas, so they don't have to post any sample data.)
If your problem involves an error message, post the complete error message you're getting, including line numbers.
Always say which version of Oracle you're using, e.g. 10.2.0.1.0.
Sorry, I can't re-create the problem.
I did something as similar as I could to what you posted:
SELECT    deptno
,         CASE deptno
              WHEN  10
              THEN  (
                        SELECT  MAX (sal)
                        FROM    scott.emp
                        WHERE   deptno  = d.deptno
                    )
              ELSE  0
          END       AS value
,         dname
FROM      scott.dept    d
WHERE     deptno  <= 30
ORDER BY  deptno
;
And it ran, producing this output:
    DEPTNO      VALUE DNAME
        10       5000 ACCOUNTING
        20          0 RESEARCH
        30          0 SALES
886562 wrote:
Hello, I'm new to Oracle DB and still learning SQL.

I've been using Oracle for 20 years, and I still learn SQL.
select ... ,
CASE column WHEN 1 THEN
    ( select max(column)
      from table
      where ... )
ELSE 0 END AS value,
The problem now is that the
select max(column)
from table
where ...
statement in its original form returns the right value as a float (e.g. 0.24), which is what I need. But the same statement embedded in the CASE expression returns 0, which looks like the same result converted to an integer or something.

Are you sure that 0 is being returned by the sub-query in the THEN clause, and not by the ELSE clause?
I tried converting the result inside the CASE expression, but without success.

What did you try? Post your code.
What problem did you have? Post the error message.
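One quick way to settle that question is to make the ELSE branch return something that cannot be confused with the sub-query's result. A minimal sketch of the idea (table and column names here are placeholders, since the original tables were never posted):

SELECT t.special_col
,      CASE t.special_col
           WHEN  1
           THEN  ( SELECT MAX (o.some_col)      -- scalar sub-query must be wrapped in parentheses
                   FROM   other_table o
                   WHERE  o.key_col = t.key_col
                 )
           ELSE  NULL                           -- NULL cannot be mistaken for a sub-query result
       END  AS value
FROM   main_table t;

If value now comes back NULL on the rows in question, those rows were never matching WHEN 1 in the first place, and no numeric conversion is involved.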

Similar Messages

  • Crystal Data Conversion Issue (Error converting data type varchar to datetime)

    Hi,
    I can run stored procedure without error in SQL Server using my personal credentials as well as database credentials.
    I can also run Crystal Report after connecting to Stored procedure without error on my desktop using my personal credentials as well as database credentials.
    But when I upload the crystal report in BOBJDEV and when I run using database credentials report fails saying that "Error in File ~tmp1d1480b8e70fd90.rpt: Unable to connect: incorrect log on parameters. Details: [Database Vendor Code: 18456 ]" but I can run the crystal report successfully on BOBJDEV using my personal credentials.
I googled the data conversion error message, and a lot of people suggested doing "Verify Database" in Crystal Reports. So I did that, but when I do it I get an error message like this:
Error converting data type varchar to datetime.
Where do you think the error might be occurring? Has anyone faced this kind of issue before? If so, how did you resolve it?
(FYI, I am using Crystal Reports 2008, and for the stored procedure I used SSMS 2012.)
    Please help me with this issue.
    Thanks & Regards.
    Naveen.
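As a side note, one classic reason a varchar-to-datetime conversion succeeds under one login and fails under another is that SQL Server derives the session's DATEFORMAT from the login's default language, so the same literal can parse differently per login. A minimal T-SQL sketch of the effect (the date literal is made up for illustration):

-- Under a login whose default language implies month-day-year:
SET DATEFORMAT mdy;
SELECT CONVERT(datetime, '21/01/2015');  -- fails: 21 is not a valid month
-- Under a login whose default language implies day-month-year:
SET DATEFORMAT dmy;
SELECT CONVERT(datetime, '21/01/2015');  -- succeeds: 21 January 2015

It may be worth comparing the default language of your personal login with that of the database credentials used by BOBJ.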

Hello Naveen,
Since the report works fine in the CR designer / desktop, we need to figure out where you should post this question.
By BOBJDEV do you mean BusinessObjects Enterprise or Crystal Reports Server? If so, please post this question to the BI Platform space.
-Jamie

  • Oracle Database : Date conversion issue between timezones

    Hi All,
We are trying to convert a date from the Europe/Amsterdam timezone to the Australia/Sydney timezone and extract the time out of it.
We are facing an issue with incorrect timings after the conversion.
Please find the issue details below.
    Environment
    Database: Oracle 10.2.0.4
    Machine: Linux RHEL 4
    Location: Amsterdam, Netherlands
Issue: Incorrect result after converting a date from the Europe/Amsterdam timezone to the Australia/Sydney timezone.
    SELECT TO_CHAR (FROM_TZ (TO_TIMESTAMP ('201110201416', 'YYYYMMDDHH24MI'),
    DBTIMEZONE
    ) AT TIME ZONE 'Australia/Sydney', 'HH24:MI')
    FROM DUAL
    Output of above SQL is 22:16.
    Expected output is 23:16
    Database Timezone (DBTIMEZONE) is set to +02:00 (i.e. GMT + 2) (Europe/Amsterdam)
    If we convert date to Australia/Tasmania timezone then we get expected output i.e. 23:16.
Question: The expected time for Australia/Sydney is 23:16, so why does the Oracle database return 22:16?
Please note that Sydney and Tasmania fall under the same timezone, yet we get different output for the two.
The Oracle database considers GMT+10 for Australia/Sydney and GMT+11 for Australia/Tasmania.
If we want to use Australia/Sydney, how should we get the correct Sydney time?
    Regards
    Shailendra

I ran a test, and it showed that when I omit TO_CHAR, both queries have the same result, 11:15 PM. So it's TO_CHAR which is the 'culprit'. Since this is a globalization issue, you may repeat your question in this forum:
Globalization Support
(Or one of the moderators may move the thread to that forum.)
Werner
By the way, my local timezone is Europe/Berlin; there should be no difference from Amsterdam.
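One thing worth ruling out here: DBTIMEZONE in this database is the fixed offset +02:00, not a region name, so the source timestamp carries no daylight-saving rules of its own. Passing the named region to FROM_TZ instead makes the intent explicit; a hedged sketch:

SELECT TO_CHAR (FROM_TZ (TO_TIMESTAMP ('201110201416', 'YYYYMMDDHH24MI'),
                         'Europe/Amsterdam'
                        ) AT TIME ZONE 'Australia/Sydney', 'HH24:MI')
FROM   dual;

With current timezone data, 14:16 CEST (+02:00) is 12:16 UTC, which is 23:16 AEDT (+11:00) on 2011-10-20, since Sydney's daylight saving had already started in October. If the database still reports +10:00 for Australia/Sydney on that date, an outdated timezone file in the database is a likely suspect.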

  • Data conversion Issue while executing the package

I am executing a package whose source is XML and whose destination is SQL Server. I created the package using an XSD and XML as the source. When I try to execute the package I get the below error.
    Any help here would be appreciated.
    -kccrga http://dbatrend.blogspot.com.au/

The error is obvious: the data types of the source and destination table columns don't match. Please change the data types of your SQL Server table to correspond to those coming from the XML, as given by the error message above.
Alternatively, apply a Derived Column task after the XML source to convert the columns' data types to match those of your table, assuming only compatible data values come from the source.
Please Mark This As Answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • HTTPService Data Conversion Issue

    I have a REST-style web service that returns an HTML document
    that looks something like this:
    <a>
    <b>
    <c>2.0</c>
    <b>
    <b>
    <c>3.2</c>
    <b>
    </a>
    I can use the browser to display this document and it looks
    as expected. However, when I use a Flex HTTPService to obtain the
    document, and then display its various values, any numbers ending
    with ".0" are truncated/rounded. For example, the first element
    <c> above is displayed as "2" instead of "2.0".
    Why are numbers being converted? Shouldn't they be treated as
    any other text from an XML element?
    This is how I am calling the service:
    <mx:HTTPService id="myService" url="itsurl" method="GET"
    useProxy="false"/>
And this is an example that shows a number being
truncated/rounded:
    <mx:Label text="{myService.lastResult.a.b[0].c}"/>
    Thanks,
    Mike

The default for HTTPService is resultFormat="object". This
causes Flex to convert your xml into a tree of dynamic objects.
This is rarely what you want. Among other things, it tries to do
type conversions.
Instead, set resultFormat="e4x". This will give you your xml
unchanged. It also gives you access to the e4x API, which is very
powerful. It includes traversal, search and filter functionality.
    Additionally, for anything but a trivial or demo app, do not
    bind directly to lastResult. Instead, use a result handler
    function. Here is an example, untested:
<mx:HTTPService id="myService" ... result="onResult(event)"/>
(in script)
import mx.rpc.ResultEvent;
[Bindable] private var _xmlResult:XML; // instance var to hold the data
private function onResult(event:ResultEvent):void {
    _xmlResult = XML(event.result);
    trace(_xmlResult.toXMLString()); // to see exactly what you have
}
    Note, your xml example is not valid, so I can't show an
    example of an e4x expression.
    Tracy

  • Data Conversion and Derived Column issues

    I have a strange issue occurring with one of my SSIS packages. 
    What needs to happen:
    I have to read data from a table that stores a field in NCHAR(40)
Send it through a Data Cleansing SSIS component that forcefully outputs the data in a cleansed state as DT_STR(50)
    Update the same source table with cleansed data - Using an UPSERT third party tool
Of course, I can't update a table that stores data in NCHAR(40) with data from DT_STR(50), so I'm trying to use the Data Conversion component, the Derived Column component, or a combination of the two, to convert the data to DT_WSTR and to set the correct length,
from 50 to 40.
    The Data Conversion Component fails when I try to set the incoming data (DT_STR(50)) to DT_WSTR(40):
    [Data Conversion [186]] Error: Data conversion failed while converting column "MD_Address1" (97) to column "_MD_Address1" (190).  The conversion returned status value 2 and status text
    "The value could not be converted because of a potential loss of data.".
    Now I tried the same thing with the Derived Column Component, converting the data from the DQS component from (DT_STR(50)) to DT_WSTR(40) and there's the error message:
    [Derived Column [196]] Error: The "Derived Column" failed because truncation occurred, and the truncation row disposition on "Derived Column.Inputs[Derived Column Input].Columns[_MD_Address1]"
    specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
    I also tried a combination of Data Conversion (From DT_STR(50) to DT_WSTR(50)) and a Derived Column just casting to the correct size, but that didn't work either. 
    Every time I try to change the size, one of the components fail. 
It appears that my issue has nothing to do with the data types, but with the actual data length.
    Why can't SSIS CAST my input from 50 to 40?
    What do I need to do to cast/convert data from DT_STR(50) to DT_WSTR(40)?

    Hi IgorSantos_FL,
It is the expected behavior when we try to convert DT_STR(50) (meaning at most 50 characters in the value) to DT_WSTR(40) (meaning at most 40 characters in the value). However, the truncation issue should not occur if you convert DT_STR(50) to
DT_WSTR(50). Could you post the error message that you received when converting DT_STR(50) to DT_WSTR(50)? It may be a different issue.
    Regards,
    Mike Yin
    TechNet Community Support
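If the target column really must stay at 40 characters, the usual way out in any engine is to truncate explicitly before converting, so the conversion no longer carries a risk of silent data loss. A hedged T-SQL illustration of the idea (in SSIS itself the equivalent would be a Derived Column expression along the lines of (DT_WSTR,40)SUBSTRING(MD_Address1,1,40), assuming nothing meaningful ever appears past position 40):

DECLARE @cleansed VARCHAR(50) = 'Some cleansed address value';
-- Truncate explicitly to 40 characters, then do the Unicode conversion:
SELECT CAST(LEFT(@cleansed, 40) AS NVARCHAR(40)) AS MD_Address1_40;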

  • Issue with Date Conversion when loading XML File into Oracle 10g Database

    Hello all,
    I have the interface shown in the screenshot below. In it, amongst other actions, I'm mapping an XML file element representing a date to an Oracle table column defined as DATE. The source and target columns are highlighted in the screenshot.
    !http://img223.imageshack.us/img223/1565/odiscr275.jpg!
    When I execute the interface, I get the following error message:
    java.lang.IllegalArgumentException at java.sql.Date.valueOf(Date.java:103)
    I'm assuming this refers to the date conversion!
    I've already tried replacing SRC_TRADES.DEAL_DATE with TO_DATE( SRC_TRADES.DEAL_DATE, 'DD/MM/YYYY' ) in the Implementation tab. This function was not recognised when I executed the interface, so it didn't work! The date value in the XML file is in DD/MM/YYYY format.
    I'm guessing that Oracle SQL Date functions don't work in the Implementation tab. Please could somebody let me know:
    1. Which Date Conversion function I could use instead?
    2. Where I can find a reference for the methods/functions I can use in the Implementation tab (if such a reference exists)?
    Cheers.
    James

Hi,
Try changing the execution area to the staging area. After you change it, write just SRC_TRADES.DEAL_DATE in the mapping box. TO_DATE only applies when the source field type is VARCHAR2, not DATE (as it is in your source datastore).
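For reference, the expression James tried is plain Oracle SQL once it runs on a database engine, which is presumably why it was "not recognised" when evaluated against the XML source technology instead. A minimal sketch of the mapping expression, assuming the incoming DEAL_DATE is a DD/MM/YYYY string rather than a DATE:

-- Valid only where an Oracle engine evaluates it (i.e. on the staging area):
TO_DATE (SRC_TRADES.DEAL_DATE, 'DD/MM/YYYY')

If the source datastore already types DEAL_DATE as a date, map the column directly as suggested above and skip TO_DATE entirely.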

  • Data conversion for new sob

    Dear Viewers
    on 11.5.10
We are creating a new SOB with a change in currency from Feb-11, as this is the requirement.
For that, we need to do a data conversion.
I am unclear about Purchase Orders and Sales Orders.
    Purchase Orders:
Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
The POs which are partially received: what is to be done for them?
If a PO is fully received and delivered it will not be converted to the new SOB, as it is not an open PO;
but if an invoice comes after Feb-11, then how will the matching be done?
What if a return has to be made moving forward in Feb-11 under the new SOB?
    Sales Orders:
Open sales orders will be converted, that is, the ones that have been entered and not yet booked.
Users have been requested to clear off the sales order lines which are already pick confirmed but not yet shipped; hence those will be shipped and interfaced to AR.
For the sales orders that have been booked, the lines that are not yet processed further will also be converted.
Now what if a receipt comes after Feb-11? How do we handle this, as the sales order would not have been converted?
Please give your advice on the data migration strategy for POs and SOs.
Please do add any point that may have been missed by me.
    Appreciate your help
    Thanks
    Emm

emm wrote:
Purchase Orders:
Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
For the POs which have been received but not delivered, we have requested the users to clear the in-transit receipts.
The POs which are partially received: what is to be done for them?
If a PO is fully received and delivered it will not be converted to the new SOB, as it is not an open PO; but if an invoice comes after Feb-11, then how will the matching be done?

The business needs to take a decision on whether they are fine with matching the POs manually (i.e. referring to the documents and verifying); in that case you may capture the PO information in a DFF on the invoice distribution. Otherwise, if the POs under this scenario have to be converted and identified, you may consider converting them with a receipt close tolerance of 100% and a matching type of 2-way (again, business approval is needed to handle audit issues) in order to avoid converting the receipts/deliveries, etc.

What if a return has to be made moving forward in Feb-11 under the new SOB?

Ideally, returns can be done using Miscellaneous/Account Alias Issues, specifying the appropriate transaction reasons to clarify the scenario.

  • Data conversion for New GL - Going live during fiscal year

    Hi Experts,
My client is going live on Oct 1 (fiscal year is Jan-Dec), so for data conversion we need to load the P&L and balance sheet as of Sept 30, 2009. I need your help to resolve a few open issues. (We have activated document splitting with zero balance and segment.)
1) When we load P&L balances, how can I assign profit centers to all these P&L items?
2) What is the procedure (best practice) to assign profit centers to all the balance sheet items, since we are going live during the fiscal year?
3) When we process open items (AR, AP) in the month of October, how will the profit center assignment work?
(My client will be using profit/cost center terminology for the first time, so we need to map all the B/S and P&L items to profit centers.)
4) What is the best way to map all the balance sheet GL accounts to profit centers?
    Sorry for asking too many questions. Any help would me very much appreciated.
    Thanks,
    Sam

1) When we load P&L balances, how can I assign profit centers to all these P&L items?
Each P&L account balance has to be uploaded with a cost centre; the PC will then be derived automatically.
2) What is the procedure (best practice) to assign profit centers to all the balance sheet items, since we are going live during the fiscal year?
Let the client decide the profit centre for the balance sheet items, since they know which balance relates to which PC.
3) When we process open items (AR, AP) in the month of October, how will the profit center assignment work?
While uploading the AR and AP balances, each line item is to be uploaded with a profit centre; the PC assignment will then work automatically in October.
(My client will be using profit/cost center terminology for the first time, so we need to map all the B/S and P&L items to profit centers.)
4) What is the best way to map all the balance sheet GL accounts to profit centers?
Try to identify each balance sheet GL balance with its own profit centre; this will help in future reports. Otherwise, put them in a common PC.

  • LRAW data Conversion during Unicode Conversion

    Hi,
We have upgraded our 46C MDMP system to a Unicode system.
After the conversion we found that some of the tables with fields of type LRAW have a conversion issue.
In the 46C system, Russian data which was entered in MS Cyrillic (code page 1504) is converted to code page 1500 after the Unicode conversion.
We have not found any entries for this table in the vocabulary, the reprocess log, or in SUMG.
For example, table STXL. This table contains the purchase order texts for materials.
Report HRUU_CONVERT_IN_UNICODE does not show LRAW data fields for conversion.
Can you suggest how LRAW data can be converted from code page 1500 to 1504?
    Thanks and Regards
    Rahul

    Hi Rahul,
    STXL is a language dependent INDX type table. Hence SPUMG will not scan this one with the default settings.
    Please have a look at the description of INDX type table handling in the Unicode conversion Guide.
    In the Unicode system, you can use UMG_ADJUST_STXL for the conversion.
    However please be aware of the restrictions:
Once you have decided on the code page and converted the object, this cannot be reverted!
Therefore it is highly recommended to check the display mode first and, in addition, to test extensively on a sandbox system!
    Best regards,
    Nils Buerckel
    SAP AG

  • Conversion issue in encoding in PI

    Hi,
I am having an encoding conversion problem. Let's have a detailed look at the scenario first.
Scenario: SNC -> PI (through proxy) -> MQ (through receiver JMS adapter) -> SeeBeyond (a middleware system).
Data come from SNC into PI in UTF-8 encoding, whereas all the other systems shown in the above flow use ISO 8859-1, so the UTF-8 content needs to be converted to ISO 8859-1. PI does this conversion in the receiver JMS adapter; CCSID has been set to 00819, which is the ISO 8859-1 encoding. But the beginning of the XML still says UTF-8, like below:
<?xml version="1.0" encoding="utf-8" ?>
whereas the content of the XML has already been converted to ISO 8859-1 by the receiver JMS adapter. Now let's look at the problem.
Problem: After the data are passed to SeeBeyond, the mapping fails there, since SeeBeyond looks at the XML declaration at the beginning, finds UTF-8 in the tag above, and tries to map the content as UTF-8 whereas it is actually ISO 8859-1, so the mapping fails. Please note, all the systems except SeeBeyond just pass the data through; no mapping is done anywhere except in SeeBeyond.
Workaround: Is there any chance we can change the declaration from UTF-8 to ISO 8859-1 in the receiver JMS adapter itself in PI, like this:
<?xml version="1.0" encoding="ISO-8859-1" ?>
    This might resolve our problem I think. Any suggestions would be appreciated.
    Thanks and Regards
    Soumya

    Hi Stefan,
Thanks for your reply. I have gone through the link you specified. Yes, that's another way to resolve the issue in this case. But here we have asked the receiver system (SeeBeyond) to change the encoding handling in their mapping, because changing that in SeeBeyond is less time consuming than adding a new bean in the JMS adapter. Our problem was resolved by changing the mapping on the receiver end. Thanks anyway for your valuable reply.
    Thanks
    Soumya

  • Data Conversion Logic

I am trying to implement data conversion logic (for example, for a UOM field). I want to load all the data conversion rules from a file into a cache and keep them there, so that when a mapping starts it just picks up the rules from the cache.
Using GlobalContainer this works within one mapping, even with several UOM fields (the file is loaded only once), but the file is loaded again each time the mapping is triggered.
Is there any way to implement the data conversion so that the file is loaded only once?
    Regards,
    Nick

    Hi Arnold,
Normally every input should be made in LC; why do you enter the data in group currency?
However, if you have to start from group currency, the only solution to avoid this error is to put the ON_ERROR_CONTINUE clause at the end of SPRUNCONVERSION.
But remember that it will then stop notifying you of every type of error, so you have to be very careful and keep this in mind when an issue occurs with the currency conversion.
    Regards
         Roberto

  • Data Conversion Errors for the last week

We've been running a simple Stream Analytics job with a very light workload for a little over a month now. Input is an Event Hub and output is SQL Server. We noticed today that we haven't received anything into SQL Server since 2014-12-08 (we don't receive events
every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems that the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
I wonder what could have happened? We haven't touched the client since we started the job, so it's still sending the messages in the same format. And we haven't touched the job's query either.
Has there been an update to either Stream Analytics or Event Hubs which could cause the issue we're seeing?

I've followed the TollApp instructions word for word (except the thing with NamespaceType "Messaging" that has been added to New-AzureSBNamespace).
I have 0 lines in the output, and this is the service log:
    Correlation ID:
    e94f5b9e-d755-4160-b49e-c8225ceced0c
    Error:
    Message:
    After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
    Message Time:
    2015-01-21 10:35:15Z
    Microsoft.Resources/EventNameV2:
    sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
    Microsoft.Resources/Operation:
    Information
    Microsoft.Resources/ResourceUri:
    /subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
    Type:
    CsvParserError
    Then I stopped the job, and connected to the event hub with a console app and received that:
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
    Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
    Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
    Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
    51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. At worst it's in the stream itself, but then I should see at least some lines in the output for the correctly
formatted messages.

  • Language conversion issue in PI7.1

    Hi,
My scenario is file to IDoc, and I am facing language conversion issues.
For example, one of the fields in the source file has the value "Différence sur net", and in the IDoc field the value appears as "DiffÃ©rence sur net".
With the same data I have done a file-to-file scenario, with and without content conversion, and the output comes out correctly as "Différence sur net".
I have enabled the Unicode checkbox in the RFC destination on the XI server as well, but it still comes out as "DiffÃ©rence sur net". Please advise how I can get the target field value to be the same as the source field value.
Can anyone help me sort out this issue?
    Thanks.
    Dinesh

    Hi,
In the channel, try using file type "Text" and encoding "UTF-8".
I think this will solve your issue.
    Thank you,
    Siva

  • Date conversion for internal table

    Hi experts,
I have an issue with date conversion.
I have declared my internal table with the same type as the standard table from which I'm fetching data.
The date comes in YYYYMMDD format and I have to change it to MM/DD/YYYY format.
I tried using a function module, and used a mask as well. It works, but when I put the value back into my internal table it truncates my conversion.
For example: 20110530, after conversion and after being put back into the internal table, becomes 05/30/20
because of the length issue; the "11" disappeared.
I tried changing the type declaration in my internal table to: date(10) type c.
But when I execute the code, at the place where I move the values from the standard table into the internal table, it gives a runtime error.
Can anyone help me with this?

    Hi,
Why not keep the date in MMDDYYYY format, dropping the '/' part? Doing this would reduce the size to 8, and the separators can be added later in a field-level routine.
    Regards
    Raj Rai
