Data conversion: Double to Number

What are the minimum size and precision of a NUMBER column when the column is bound to a double in C++/OCI?

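For context, the kind of bind the question refers to looks roughly like the sketch below. This is a hedged illustration only: the statement text and bind position are hypothetical, the handles are assumed to be set up already, and error handling is abbreviated.

// Minimal sketch (not a complete program): bind a C++ double to a NUMBER
// placeholder with OCI. Assumes errhp/stmthp are valid handles and a statement
// such as "INSERT INTO t (num_col) VALUES (:1)" (hypothetical) was prepared.
#include <oci.h>

sword bind_double(OCIStmt *stmthp, OCIError *errhp, double &value)
{
    OCIBind *bindp = nullptr;
    // SQLT_FLT with value_sz == sizeof(double) tells OCI the host variable is a
    // native double; OCI converts it to Oracle NUMBER when the statement executes.
    // The bound variable must stay alive until OCIStmtExecute() runs.
    return OCIBindByPos(stmthp, &bindp, errhp,
                        1,                     /* placeholder position :1 */
                        &value, sizeof(value),
                        SQLT_FLT,
                        nullptr, nullptr, nullptr,
                        0, nullptr, OCI_DEFAULT);
}

On the column side: an IEEE 754 double carries at most about 15-17 significant decimal digits, so an unconstrained NUMBER (which allows up to 38 significant digits) is comfortably wide enough, although NUMBER's exponent range is narrower than a double's; BINARY_DOUBLE, where available, matches the host type exactly.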

Similar Messages

  • Webi report - save as .csv is bringing double the number of rows.

    Hello all,
    I am running a WebI report and then saving it as a .csv file to my desktop, but when I save the file it gives me double the number of rows. In my report I have employee, employee key, attributes, Org unit and some measures.
    The .csv file has one additional column at the end which does not have any header; that column has "1" in the rows which should be legitimate and "0" in the rows which are the so-called duplicates. These duplicate rows contain all the data except the employee key and org unit key.
    If I save as Excel I get the right number of rows.
    Has anyone seen this issue before?
    Thanks in advance,

    Exporting to csv is different from exporting to Excel.
    If you have any filter on your crosstab or table and you export it to Excel, it will show your data according to that filter. Let's say your table has amounts 0 and 1 and you filter that column so it shows only the records where amount = 1. If you export it to Excel you'll get what you see in your WebI report, that is, only the records with amount = 1.
    But if you export the same WebI report to csv, with the same crosstab or table filtered by amount = 1, the csv export will ignore this filter and your csv will include both amount 0 and amount 1 records.
    I don't know if that's a WebI bug, but that's what has happened to me.
    A workaround could be adding your filter directly in the query pane instead of filtering your table columns.
    Also check your query pane to see which object is bringing that extra column into your table.

  • Date conversion logic using function modules

    Hi there,
    I have been researching in the forum how data conversion is done in ABAP and I found out about the function modules
    CONVERSION_EXIT_LDATE_OUTPUT
    and
    MONTH_NAMES_GET
    It seems that the first function module can convert the date format 06/08/2011 to MONTH DD, YYYY, and the second function can easily translate the month names once the language SPRAS has been specified.
    The problem is I am not sure how I can convert my VBRK-FKDAT with format MM/DD/YYYY into the format DD MONTH YYYY, with the MONTH still dependent on the specified language.
    I tried to look at how the MONTH_NAMES_GET FM works and made this code:
    DATA: d_return  LIKE sy-subrc,
          itab_t247 LIKE t247 OCCURS 0 WITH HEADER LINE,
          gs_spras  TYPE spras.

    gs_spras = 'RU'.

    CALL FUNCTION 'MONTH_NAMES_GET'
      EXPORTING
        LANGUAGE              = gs_spras
      IMPORTING
        RETURN_CODE           = d_return
      TABLES
        MONTH_NAMES           = itab_t247
      EXCEPTIONS
        MONTH_NAMES_NOT_FOUND = 1
        OTHERS                = 2.    "period needed to close the CALL FUNCTION statement
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.

    LOOP AT itab_t247.
      WRITE: / itab_t247-LTX.
    ENDLOOP.
    The code just lists the months in Russian (which is the language I need). I am just new to ABAP so I am an avid reader of the forum. I hope someone can help.
    Thanks,
    dgrachee

    Hi Dgrachee ;
    You can change it as follows.
    Best Regards.
    gs_spras = sy-langu. "'RU'.
    CALL FUNCTION 'MONTH_NAMES_GET'
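    Outside ABAP, the same idea (a month number indexing into a language-dependent name table, which is what MONTH_NAMES_GET reads from T247) can be sketched as below; the hard-coded Russian month names are illustrative only and would normally come from such a table.

    // Hedged C++ sketch: format an MM/DD/YYYY string as "DD Month YYYY" using a
    // language-dependent month-name table (Russian names shown for illustration).
    #include <array>
    #include <cstdio>
    #include <string>

    std::string to_long_date(const std::string& mmddyyyy)
    {
        static const std::array<const char*, 12> month_names = {
            "января", "февраля", "марта", "апреля", "мая", "июня",
            "июля", "августа", "сентября", "октября", "ноября", "декабря"};

        // Expects "MM/DD/YYYY"; validation is omitted in this sketch.
        const int mm = std::stoi(mmddyyyy.substr(0, 2));
        const int dd = std::stoi(mmddyyyy.substr(3, 2));
        const std::string yyyy = mmddyyyy.substr(6, 4);

        char buf[64];
        std::snprintf(buf, sizeof(buf), "%02d %s %s", dd, month_names.at(mm - 1), yyyy.c_str());
        return buf;
    }
    // to_long_date("06/08/2011") -> "08 июня 2011"

    In the ABAP case the equivalent step is reading the T247 line whose month number matches the month of VBRK-FKDAT and writing its LTX between the day and the year.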

  • Data conversion of material BOM by LSMW

    Hi,
    Right now I am trying to work on the LSMW settings for data conversion of a material BOM. However, I have forgotten how to set up the linkage of BOM header & item in LSMW, since my last experience was a year ago. I have set up 2 source structures, BOM_HD (header) & BOM_IT (item), from 2 files. Both source structures contain the same source field, BOM_ID. From this identical BOM_ID field, the linkage of BOM header and items is set up between the 2 files. Does anyone know how to make this setting in LSMW?
    Thanks.
    Regards,
    Nancy

    Hello Nancy,
    There is no need to create separate recordings for header and item; one is enough. In the LSMW, in the first step select the direct input method, and at the source structure level create two structures, one for the header and the other for the item.
    Then, in the same way, while specifying the files, give the header file first and then the item file.
    At the same step check whether the Convert Data step is getting the logical path and file name; if not, double-click at that position and create it. Then your recording will work perfectly.
    If you want the entire steps with screenshots I can send them to you.

  • Epoch Date Conversion

    All,
    I'm trying to convert an Epoch number to a Date format. The number is the total milliseconds from Jan 1, 1970 to today and is in GMT. We are physically located in Central Time, and I would like to convert it back to this format 'mmddyyyy'...
    Any help will be appreciated.
    Thanks!

    BigFeetBigShoes wrote:
    > All,
    > I'm trying to convert an Epoch number to a Date format. The number is the total milliseconds from Jan 1, 1970 to today and is in GMT. We are physically located in Central Time.
    Well, if accuracy isn't such a big deal, divide the epoch offset by 1000 to get seconds, then simply use dateAdd (it might overflow with milliseconds). Otherwise you might try using java.util.Date and init it with your epoch offset; CF should be able to use the datetime that it produces. I think the conversion will also handle the conversion to the server timezone (tz). Sorry, this is from memory, at home without access to a server to test.
    If you need other tz then have a look at:
    http://www.sustainablegis.com/projects/tz/testTZCFC.cfm
    but also read these, especially the tz hell entry:
    http://www.sustainablegis.com/blog/cfg11n/index.cfm?mode=cat&catid=162AEDB4-20ED-7DEE-2A6C F1B79AC2E03A
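    In case a non-ColdFusion sketch helps, the same conversion (epoch milliseconds in UTC to a local 'mmddyyyy' string) can be written in C++ as below; it assumes the process's local time zone is the one you want (e.g. Central Time) and omits error handling.

    // Hedged sketch: epoch milliseconds (UTC) -> local "mmddyyyy" string.
    #include <cstdio>
    #include <ctime>
    #include <string>

    std::string epoch_ms_to_mmddyyyy(long long epoch_ms)
    {
        std::time_t secs = static_cast<std::time_t>(epoch_ms / 1000);  // drop the milliseconds
        std::tm local{};
    #if defined(_WIN32)
        localtime_s(&local, &secs);   // UTC seconds -> local calendar time
    #else
        localtime_r(&secs, &local);
    #endif
        char buf[16];
        std::snprintf(buf, sizeof(buf), "%02d%02d%04d",
                      local.tm_mon + 1, local.tm_mday, local.tm_year + 1900);
        return buf;
    }
    // epoch_ms_to_mmddyyyy(0) -> "12311969" when the local zone is US Central (UTC-6).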

  • Compliance Settings - compliance counts inaccurate (roughly double the number they should be)

    I'm deploying some fairly simple configuration baselines - most with a single CI in it; some remediate, some do not.
    In all cases, the Compliance count (as found on the Assets and Compliance/Compliance Settings/Configuration Baselines page in the SCCM console) is roughly double the number it should be.
    The same numbers can be found in the report: Compliance And Settings Management \ Summary Compliance by configuration baseline. In that report, however, you can drill into the individual baselines, which takes you to "List of assets by compliance state
    for a configuration baseline". I find that if I add up all of the assets from this report, the number is accurate.
    We have about 4000 people in one of these CI baseline deployments, and it's showing about 8000 are compliant. We don't even have anywhere near that many clients. In another one, we have about 150 users and it's showing 280 compliant. 
    It just seems that the summarization data is somehow inflated, and completely inaccurate. Anyone know how that is calculated, or have you seen this before?

    Following some leads on my own research...
    The summarization process lives in WMI, of course, like everything else. It's apparently in root\sms\site_[SiteCode]:SMS_SummaryTask, executing the method: RequestExecution.
    The "TaskCommand" on the SMS_SummaryTask shows that it's actually executing the stored procedure: spTaskDCMComplianceSummary.
    That stored procedure loops through all the configuration baselines, and executes the stored procedure spDCM_SummarizeBaseline for each of them.
    That stored procedure ends up running a query that looks like this:
    select SiteNumber, CI_ID, SummaryType, CountTargeted, CountCompliant, FailureCount, ActivatedCount, CountNoncompliant, EnforcedCount, Severity, LastSummaryTime from
    CI_ComplianceSummary where CI_ID=@bl_ciid 
    I set the @bl_ciid and ran that select query. I got results that look like this:
    SiteNumber  CI_ID     SummaryType  CountTargeted  CountCompliant  FailureCount  ActivatedCount  CountNoncompliant  EnforcedCount  Severity  LastSummaryTime
    1           16795052  1            0              153             2             0               0                  4              3         2014-02-13 23:33:31.633
    1           16795052  2            0              141             2             0               0                  4              3         2014-02-13 23:33:31.637
    The number for this configuration baseline in the summary page is 294. That just so happens to be these two things added together (153+141). 
    Does anyone know what these two things are? SummaryType = 1, SummaryType =2 ? It seems like the summarization process is adding them together, when it should not be.

  • Data Conversion - BP and Item

    Dear all,
    Would like to ask for the advice of Data conversion of BP code and Item Code.
    Any guideline ?
    i know it is not suggested to do so, but i really need to change the bp code and item code.
    Thank you very much

    Hi Tac,
    once journal entries are linked to any master data, the code cannot be changed anymore. If you are just implementing & have a very low number of transactions, you might consider creating a new database using Copy Express & then just import your amended codes & start working with that new db.
    Alternatively, you could group all the 'old' codes in an appropriately named BP/item group (e.g. DoNotUse_BP) & mark them as 'on hold'. Then create new master data with the correct codes using opening balances. The master data could carry a header UDF where the old & new codes are referenced.
    If you do this, be careful if you use FIFO items; you might want to import the already posted transactions using the new item codes to recreate the layers, instead of starting with 1 opening layer.
    All the best,
    Kerstin

  • I just opened an "old" project in a newer version of iPhoto and the pages in the book doubled in number with the seam in the middle of each page.  Suggestions?

    I just opened an "old" project in a newer version of iPhoto and the pages in the book doubled in number with the seam in the middle of each page.  Suggestions?

    Start by trying the following:
    1 - delete the iPhoto preference file, com.apple.iPhoto.plist, that resides in your
         User/Home/Library/Preferences folder.
    2 - delete iPhoto's cache file by deleting the contents of the following folder:
    User/Library/Containers/com.apple.iPhoto/
    Data/Library/Caches/com.apple.iPhoto
    3 - reboot, launch iPhoto and try again.
    NOTE: For Mavericks, 10.9, go to your Home folder and use the View ➙ Show View Options menu to bring up the window where you can check the Show Library Folder checkbox.
    If that doesn't help continue with:
    1 - launch iPhoto with the Command+Option keys held down and rebuild the library.
    2 - Run Option #4, Rebuild Database.
    Happy New Year

  • How to display dynamic data and double data (chart)

    Hi all !
    I would like to display dynamic data and double data by using charts.
    However, it doesn't work and I don't understand why.
    Can you help me fix that?
    Best regards,
    - John
    Attachments:
    double_dynamics_data.vi ‏54 KB

    Hi John,
    the sine looks like it should look with your VI!
    You convert the DDT to a scalar DBL. That conversion will only use the first sample of the generated sine waveform - usually this is always the same (offset) value when you generate full periods… Or to put it in a mathematical expression: sin(0°)+4 = 4!
    When you want to display waveform you should work with waveforms!
    Actually, I have used the Dynamic Data Type in order to simulate a signal.
    Well, I also simulated a signal in my example VI. I didn't use any ExpressVI to do so…
    In a few weeks, I will receive a sensor. I will use the 'DAQ Assistant' in LabVIEW, whose output is Dynamic Data.
    Nobody forces you to use the DAQ Assistant ExpressVI. There are nice DAQmx functions available, and LabVIEW comes with a lot of ready-to-use example VIs…
    That's why I used the 'Simulate Signal' in the first place. So I think I should keep the Dynamic Data Type.
    I do think you don't need to use the DDT. I think it's better to understand what is going on in your VI. And I think the DDT will be no help for you…
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome

  • What is the transparent table to get status date of a QN number ?

    Hi all,
    I have a requirement to get the date based on the status of a QN number. For example, a QN number might have the status APNO (no approval); when I approve it with OK it becomes APOK, or if I approve it with 'refused' it becomes APRF. And when all the tasks are completed it becomes ATCO.
    Now I want to get the APNO date, APOK date, APRF date and ATCO date for the QN number. What is the transparent table I can get this information from?
    Thanks in Advance,
    Hung

    - Current statuses are stored in JEST (OBJNR = QMEL-OBJNR), the change log is in JCDS, and the status user profile in JSTO. A record is created in JEST the first time a status is set; when setting/unsetting a status, a record is appended to JCDS.
    - Don't forget to convert the status into the internal status, and to distinguish between system statuses (internal Innnn, conversion via TJ02T) and user statuses (external Ennnn, conversion via TJ30T). Take a look at FM STATUS_TEXT_EDIT or STATUS_NUMBER_CONVERSION and STATUS_TEXT_CONVERSION.
    Regards,
    Raymond

  • Char to Date conversion problem - again!

    Hi,
    in order to make date selection 'more attractive', I have been tasked with presenting a text field with 'January 2010', 'February 2010', etc. to the end users for their prompts.
    So far so easy...
    However, I then need to use the values to derive either dates, or years, or date numbers.
    This all works fine during development when I am testing using a default parameter thus; -
    '@{MYMONTHYEAR}{September 2010}' - which is fed into various very complex functions.
    BUT - when I try to use the report in conjunction with the parameter I get; -
    'The evaluation of the index 6 expression on row count -1 failed. [nQSError: 46046] Datetime value 2010/09/01 does not match the specified format'
    However, cast('2010/09/01' as date) works fine... and convoluted as the logic of my functions is, the fundamental char to date conversion is exactly in this format.
    Anyone shed any light on this, give me a guaranteed way to handle char to date conversion?
    thanks,
    Robert.

    Hi,
    I tried your workaround case when 1=2... but I still get the same error....
    Not when I have the format from my default value, but when I get the value from a dashboard prompt in a page.
    The full error is; -
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 46046] Datetime value 2012/09/01 does not match the specified format. (HY000)
    SQL Issued: SELECT s_0, s_1, s_2, s_3, s_4, s_5, s_6, s_7, s_8, s_9, s_10, s_11, s_12, s_13, s_14, s_15, s_16, s_17, s_18, s_19, s_20, s_21, s_22, s_23, s_24, s_25, s_26, s_27, s_28, s_29, s_30, s_31, s_32, s_33, s_34, s_35, s_36, s_37, s_38, s_39, s_40, s_41, s_42, s_43, s_44, s_45, s_46, s_47, s_48, s_49, s_50, s_51, s_52, s_53, s_54, s_55, s_56, s_57 FROM ( SELECT 0 s_0, "Vivaldi Detail"."Product Global Hierarchy"."Tobacco Category Groups" s_1, "Vivaldi Detail"."Product Global Hierarchy"."Tobacco Category" s_2, "Vivaldi Detail"."Product Global Hierarchy"."Total Tobacco Products" s_3, FILTER("Vivaldi Detail"."Fact Sales Detail"."SOM %" USING ("Vivaldi Detail"."Time"."Year" =cast((cast(right('September 2012',4) as integer)-1) as char(4)))) s_4, FILTER("Vivaldi Detail"."Fact Sales Detail"."SOM %" USING ("Vivaldi Detail"."Time"."Year" =cast((cast(right('September 2012',4) as integer)-2) as char(4)))) s_5, FILTER("Vivaldi Detail"."Fact Sales Detail"."SOM %" USING ("Vivaldi Detail"."Time"."Year" =cast((cast(right('September 2012',4) as integer)-3) as char(4)))) s_6, FILTER(FILTER("Vivaldi Detail"."Fact Sales Detail"."SOM %" USING ("Vivaldi Detail"."Time"."Year" =cast((cast(right('September 2012',4) as integer)-1) as char))) USING "Vivaldi Detail"."Time"."Month Number" <=month(CASE WHEN 1=2 THEN CURRENT_DATE ELSE cast(right('September 2012',4)||'/'||CASE left('September 2012',LOCATE(' ','September 2012')-1) WHEN 'January' THEN '01/01' WHEN 'February' then '02/01' when 'March' then '03/01' when 'April' then '04/01' when 'May' then '05/01' when 'June' then '06/01' when 'July' then '07/01' when 'August' then '08/01' when 'September' then '09/01' when 'October' then '10/01' when 'November' then '11/01' ELSE '12/01' END as date)END)) s_7, FILTER(FILTER("Vivaldi Detail"."Fact Sales Detail"."SOM %" USING ("Vivaldi Detail"."Time"."Year" =right('September 2012',4))) USING "Vivaldi Detail"."Time"."Month Number" <=month(CASE WHEN 1=2 THEN CURRENT_DATE ELSE cast(right('September 2012',4)||'/'||CASE left('September 2012',LOCATE(' ','September 2012')-1) WHEN 'January' THEN '01/01' WHEN 'February' then '02/01' when
    A brief sample of a typical formula which causes this is; -
    FILTER(FILTER(FILTER("Fact Sales Detail"."SOM %" USING ("Time"."Year" =   right('@{MYEAR}{September 2010}',4))) USING ("Time"."Month Number" = 2)) USING "Time"."Month Number" <= month(CASE WHEN 1=2 THEN CURRENT_DATE ELSE cast(right('@{MYEAR}{September 2010}',4)||'/'||CASE left('@{MYEAR}{September 2010}',  LOCATE(' ','@{MYEAR}{September 2010}') -1) WHEN 'January' THEN '01/01'  WHEN 'February' then '02/01' when 'March' then '03/01' when 'April' then '04/01' when 'May' then '05/01' when 'June' then '06/01' when 'July' then '07/01' when 'August' then '08/01' when 'September' then '09/01' when 'October' then '10/01' when 'November' then '11/01' ELSE '12/01' END as date)  END    ))
    The September 2010 default being EXACTLY what is selected from the dashboard prompt when testing. It does not error, but returns no data in isolation, but when run from the prompt it errors...
    And the date format, if fed into a cast ('2012/09/01' as date) does not error.
    Anyone help - this is really driving me mad, syntax is good - it should work!!!
    thanks,
    Robert.
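    For clarity, the bare string-to-date logic the formula encodes ('Month YYYY' prompt value to a 'YYYY/MM/01' string) is sketched below in C++; it says nothing about OBIEE's cast behaviour, only the parsing that the nested right()/LOCATE()/CASE expressions perform.

    // Hedged sketch: "September 2010" -> "2010/09/01" (empty string if unrecognised).
    #include <array>
    #include <cstdio>
    #include <string>

    std::string month_year_to_date(const std::string& prompt_value)
    {
        static const std::array<const char*, 12> months = {
            "January", "February", "March", "April", "May", "June",
            "July", "August", "September", "October", "November", "December"};

        const std::size_t space = prompt_value.find(' ');
        if (space == std::string::npos) return "";

        const std::string month = prompt_value.substr(0, space);   // left(..., LOCATE(' ', ...) - 1)
        const std::string year  = prompt_value.substr(space + 1);  // right(..., 4)

        for (std::size_t i = 0; i < months.size(); ++i) {
            if (month == months[i]) {
                char buf[16];
                std::snprintf(buf, sizeof(buf), "%s/%02zu/01", year.c_str(), i + 1);
                return buf;
            }
        }
        return "";
    }
    // month_year_to_date("September 2010") -> "2010/09/01"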

  • Data conversion error. Overflowed the specific type.

    Hi guys, as usual I am struggling with the data conversion. I have a column BS as Float in SQL; typical values are 10.5656445, 899.66552366, etc. I am trying to move the data into Excel, but I want only one decimal. I tried decimal in the data conversion with precision 1 but it returns the same value; now I am trying numeric with precision 1 but it returns the error "Conversion failed because the data value overflowed the specified type". Any suggestion? Thanks

    Could you tell me what should be the result for 10.5656445, 899.66552366?  Is that 10.5, 899.6 ?
    If yes then, set Precision: 4, Scale: 1
    (This is a total of 4 digits, 3 digits for the whole number and 1 for the fractional)
    Cheers,
    Vaibhav Chaudhari
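    A small sketch of the digit arithmetic behind that suggestion (assuming the usual definition: precision is the total number of significant digits and scale the number of digits after the decimal point, so the integer part gets precision minus scale digits):

    // Hedged illustration of why precision 1 overflows for 899.x but precision 4, scale 1 fits.
    #include <cmath>
    #include <cstdio>

    bool fits(double value, int precision, int scale)
    {
        const double rounded = std::round(value * std::pow(10.0, scale)) / std::pow(10.0, scale);
        // Largest magnitude representable, e.g. precision 4 / scale 1 -> 999.9
        const double limit = std::pow(10.0, precision - scale) - std::pow(10.0, -scale);
        return std::fabs(rounded) <= limit;
    }

    int main()
    {
        std::printf("%d\n", fits(899.66552366, 1, 1));  // 0: NUMERIC(1,1) tops out at 0.9 -> overflow
        std::printf("%d\n", fits(899.66552366, 4, 1));  // 1: NUMERIC(4,1) holds values up to 999.9
        return 0;
    }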

  • HT1689 Since doing the 7.0.4 update I notice that my mobile data has doubled and I am running out of data mid month. I've disabled apps so not many are running. Any suggestions?

    Since doing the 7.0.4 update I notice that my mobile data has doubled and I am running out of data mid month. I've disabled apps so not many are running. Any suggestions?

    I am having the same issue even with the newer 7.1. I have a 1-100 KB data push on both my husband's and my phone every hour (i.e. mine will be at :23 of the hour and his will be at :13 of the hour, and it will change). This happens whether or not we are on WiFi, and with all apps, background refreshes, push notifications, etc. turned OFF.

  • GL Data Conversion from Legacy to Oracle EBS

    Hi All,
    I am tasked with converting legacy data into GL. I have worked on other conversions like supplier, AR and AP conversions, but never on a GL conversion, so I don't know how and where to start, what the prerequisites are for entering data into the gl_interface table, how to ensure that the data is correct in the .csv file, etc.
    I kindly request you all to guide me in the right direction. Hope to hear from you all very soon!!! :)
    Regards

    Hi,
    When you talk about GL data, it is the trial balance to be loaded from the legacy system into Oracle Applications.
    You can use Web ADI to upload it. The checklist is:
    The balance for each account combination in the legacy system is to be mapped to a GL Oracle code combination balance.
    With respect to open AR and AP invoices: if the invoices are converted using a control account for migration in GL, then the balances transferred from AP and AR need not be reversed in GL from the source receivables and payables.
    If instead the same account combinations are used for migrating the balances from AP and AR to GL, then the balances transferred from AP and AR should be reversed, so that they do not affect the GL trial balance.
    Finally, ensure that the trial balance tallies with your legacy system, then upload it using Web ADI, import, review and post it.
    Hope these points help your GL data conversion.
    Regards,
    Ramaa

  • Data conversion strategy for new SOB

    Dear Viewers
    on 11.5.10
    We are creating a new SOB with a change in currency from Feb-11 as this is the requirement
    For the same, we need to do data conversion.
    I am confused about Purchase Orders and Sales Orders.
    Purchase Orders:
    Open purchase orders will be converted, meaning the unfulfilled POs, i.e. the ones not received and fully open.
    For the POs which have been received but not delivered, the users have been requested to clear the in-transit receipts.
    The POs which are partially received: what is to be done for them?
    A PO that is fully received and delivered will not be converted to the new SOB as it is not an open PO,
    but if the invoice comes after Feb-11, how will the matching be done?
    What if a return has to be made moving forward in Feb-11 under the new SOB?
    Sales Orders:
    Open sales orders will be converted, that is the ones that have been entered and not yet booked.
    Users have been requested to clear off the sales order lines which are already pick confirmed but not yet shipped, hence they will be shipped and interfaced to AR.
    For the sales orders that have been booked, those lines that are not yet processed further will also be converted.
    Now what if a receipt comes after Feb-11? How do we handle this, as the sales order would not have been converted?
    Please give your advice on the data migration strategy for POs and SOs.
    Please do add any points that I may have missed.
    Appreciate your help
    Thanks
    Emm

    Hi David,
    for master data conversion you can use LSMW and the RE-FX BAPIs. (please refer to SAP note  [782947|https://service.sap.com/sap/support/notes/782947] ).
    Regards, Franz

Maybe you are looking for

  • My keyboard layout has changed. I no longer have an @ key.

    About an hour ago, my keyboard layout changed (but only when using firefox) so that, for example, Shift+2 no longer produces @, but rather double quotes ("). Also, pressing the [ key now produces nothing, but the next key I hit after it will produce

  • Import doesn't work for me on Vista

    I am running on Vista. I try to import from a CF card to my Users/John/Pictures directory. Although Lightroom sees my card and the files on it, when I try to do the import I get a dialog box saying "The following files were not imported. Could not co

  • How do you make it send an email on a condition? Or automate emails?

    How do you make it automatically send emails or even on a condition? (like qty left of inventory is 0)

  • DST - Choose the key from the allowed namespace

    Hi everyone. I am trying to add the new Timezone setting as per note 919538. When I do, I get a message "Choose the key from the allowed namespace". Another search in OSS said that this can only be corrected with a support pack. Does anyone else have e

  • Essbase Studio Deployment Dimension Build Errors

    I have a ASO cube used to handle Subsidiary financial reporting to parent. Rather than alternate hierarchies, we use attribute dimensions for yearly differentiation. The ASO cube typically holds information for Current year + 3 prior years. This is t