Adding a time column

I would like a custom column that displays just the start time (and similarly the finish time) in hrs:min:sec format. I need it to export tasks to an external program that reads CSV files but expects the start time and start date
in two different columns. I know that if I set the date format correctly in the options, I can get the Start column to display just a date, but I still need to fabricate a column that shows just the time.
I've tried various formulae in the custom columns, but can't seem to find one that does the job. Can anyone help?

>> start time and start date in two different columns. 
Try Format([Scheduled Start], "mm/dd/yyyy") and
Format([Scheduled Start], "hh:nn:ss") in custom text fields. (In these format strings "nn" is minutes, avoiding confusion with "mm", which means month.) -- Regards, Ismet
Ref. http://msdn.microsoft.com/en-us/library/office/gg251755(v=office.15).aspx
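If it helps to sanity-check the CSV layout the external program expects, here is a minimal sketch in Python (the task values are made up) producing the same two-column split as the Format() masks above:

    import csv
    from datetime import datetime

    start = datetime(2024, 1, 15, 8, 30, 0)  # hypothetical scheduled start
    with open("tasks.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Start Date", "Start Time"])
        # mirrors Format([Scheduled Start], "mm/dd/yyyy") and "hh:nn:ss"
        writer.writerow([start.strftime("%m/%d/%Y"), start.strftime("%H:%M:%S")])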

Similar Messages

  • Is the time column in VA05 the time the sales order created?

    I have added a time column in VA05. May I know whether it is the time the sales order was created? How do I know if it is the creation time? I need to know the sales order creation time, for example 7:00. Thank you

    Hi
    Yes, this is the order creation time. You can cross-check it against the entry time stored in table VBAK (field VBAK-ERZET), passing VBAK-VBELN as the order number.
    Thanks
    Indranil

  • Adding 18 hrs to a Date/Time column

    Hi All 
    Power Query provides the functionality to add days to a Date/Time column with the Date.AddDays() function. However, this appears to accept only whole numbers as input.
    I want to add 18 hours to a Date/Time column. How can I do this?
    Cheers
    Steve

    I was able to achieve this by converting the datetime into a number, adding 0.75, and then converting back into a datetime.
    Here is the code; the source is just a workbook with the following data:
    let
        Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
        AddC = Table.AddColumn(Source, "Time", each Number.From([DateTime])),
        ValueAdd = Table.AddColumn(AddC, "Custom", each [Time] + 0.75),
        HoursAdded = Table.AddColumn(ValueAdd, "Time +18", each DateTime.From([Custom]))
    in
        HoursAdded
    (Screenshot of the final output omitted.)
    Would be interested if there is a more elegant alternative.
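    For what it's worth, the 0.75 works because spreadsheet serial datetimes count days, so 18 hours is 18/24 = 0.75 of a day. A quick sketch of the same arithmetic in Python, just to make the equivalence visible (the sample value is made up):

        from datetime import datetime, timedelta

        dt = datetime(2024, 1, 1, 6, 0, 0)    # hypothetical source datetime
        print(18 / 24)                        # 0.75, the fraction of a day added above
        print(dt + timedelta(hours=18))       # 2024-01-02 00:00:00

    In M itself, adding a duration directly (each [DateTime] + #duration(0, 18, 0, 0)) may be the more elegant alternative you are after, as it skips the round trip through numbers.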

  • Issue adding a new column to OBIEE presentation layer

    Hi,
    I am pretty new to OBIEE, so this may be a silly question. I am facing some problems trying to add a new column from a physical table to the OBIEE presentation layer.
    Below are the steps I followed:
    Task: add a product category set to the presentation layer.
    Steps:
    1) First verified that the product category set is being populated in the W_PROD_CAT_DH table in the data warehouse. This was done by running the following SQL:
    SELECT W_PROD_CAT_DH.TOP_LVL_PRODCAT_NAME
    FROM W_PROD_CAT_DH, W_INVENTORY_PRODUCT_D
    WHERE W_PROD_CAT_DH.integration_id = W_INVENTORY_PRODUCT_D.INV_PROD_CAT1
    This SQL returned the right category values.
    2) Created an alias of the table W_PROD_CAT_DH in the OBIEE Admin Tool physical layer (done by right-clicking W_PROD_CAT_DH >> New Object >> Alias).
    3) Named the alias of W_PROD_CAT_DH "Dim_W_INV_MKT_CAT" and created a key of the alias named "integration id" on the column "integration_id".
    4) Opened the pre-existing alias of W_INVENTORY_PRODUCT_D, called "Dim_W_INVENTORY_PRODUCT_D", and added a key called "MKT_CAT" on the column "INV_PROD_CAT1" in this alias.
    5) In the alias "Dim_W_INVENTORY_PRODUCT_D" added a foreign key; the expression of the key is the following: ("Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_INV_MKT_CAT"."INTEGRATION_ID" = "Oracle Data Warehouse"."Catalog"."dbo"."Dim_W_INVENTORY_PRODUCT_D"."INV_PROD_CAT1")
    Logical layer
    6) Opened the existing logical folder called "Dim - Inventory Product", clicked the "Sources" tab, double-clicked the source "Dim_W_INVENTORY_PRODUCT_D", and added a new mapping under "Map to these tables" on the "General" tab. What was added there was the physical alias "Dim_W_INV_MKT_CAT".
    7) Added a new column mapping to the same logical table source (Dim_W_INVENTORY_PRODUCT_D); the new mapping was the column "TOP_LVL_PRODCAT_NAME" from the alias "Dim_W_INV_MKT_CAT".
    Presentation Layer
    8) Dragged the newly added column (TOP_LVL_PRODCAT_NAME) from the logical layer "Dim - Inventory Product" to the presentation layer "Product" folder.
    ISSUE
    After adding everything, checking global consistency, saving, and checking in my work, when I log in from the front-end presentation services I see the newly added column under the "Product" folder. Dragging the column by itself into an ad-hoc analysis shows the different category values. Adding a second column from the same logical folder ("Dim - Inventory Product") still shows the right product name and corresponding category. BUT when I drag in any column from any other folder (such as "Day" from "Time", or any fact values), the query returns no data (message: The specified criteria didn't result in any data.)
    I am not sure if I am missing a step, but I know the new table alias is mapped to inventory_product_d, since I see correct results when creating analyses from columns in those two folders; no other folders give me any data.
    I also know that the logical folder "Dim - Inventory Product" is joined with other tables, because I see results when I combine its other columns with other folders, as long as I leave out the newly added category column.
    I would really appreciate any insight into this. We are using OBIEE 11.1.1.6 with the Oracle 11.5.10 adaptor (SDE_ORA11510_Adopter).
    I will try to upload some screenshots if needed, but presently I'm not sure if it's something really simple.

    Hi Prassu,
    Thanks for the response, and apologies for the delay in getting back to you.
    > 1) First of all, set the logging level to 3 in the Admin Tool and save it.
    > You can get the SQL query from here:
    > Settings (Answers, top right) --> Administration --> Manage Sessions --> View Log
    When I try to do this I get the following message, and no log files:
    Error
         Log Could Not Be Retrieved
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 27002] Near <3790667783>: Syntax error [nQSError: 26012] . (HY000)

  • Getting the Headings and Time Column on Excel Spreadsheet

    Hi,
    I am writing a 2-D array of readings to an Excel spreadsheet. Currently I only have the readings in the spreadsheet. Ideally I would like a time column in the spreadsheet as well, and also the ability to add a name to each column of data. Is there any way I can do this? I have attached an image of the wiring of my interface below.
    Thanks,
    Rohit

    Attached is how I save data and timestamps to a CSV file.
    The header information is on the far left, before the loop.
    It's probably considered an "old school" way of doing things.
    Let me know if this helps.
    Bill
    Attachments:
    Temperature logger LV7 SCC 6-26-07.vi (511 KB)

  • Sorting on time column is giving error in 3.1 endeca

    Hi All,
    When I sort on the event time column I get the error below:
    An error occurred while executing query: exception encountered while executing external function 'internal:navigation-query', caused by error endeca-err:MDEX0001 : Invalid input : Error in EQL Query: - In statement "_$Records_1385576435829": In ORDER BY clause: ORDER BY clauses may refer only to attributes defined in the same statement; "EVENT_TIMEYEAR" is not
    I am not sure what this error means. What should be done to remove it, or should I remove the sorting on this column?
    Thanks,
    Amrit

    Amrit, it is possibly the changed treatment of ARB.
    What kind of error are you getting for those columns that are erroring?
    Please see this topic from the Endeca Server 7.6.x Migration Guide, which highlights the changes in EQL (Endeca Query Language) that took place in this release:
    http://docs.oracle.com/cd/E40521_01/server.760/es_migration/toc.htm#Endeca%20Query%20Language%20changes
    You will see the following (quoting from the linked documentation):
    For EQL 7.6.x, the ARB, COUNT, and COUNTDISTINCT functions have been updated to work with multi-assign attributes. You will also see that "In 7.6, an implicit ARB is no longer added. That is, a SELECT of non-aggregate attributes is no longer allowed."
    Also, see the following doc about the ARB function: http://docs.oracle.com/cd/E40521_01/server.760/es_eql/toc.htm#ARB
    In this doc, it is stated:
    ARB works as follows:
    For a single-assign attribute, ARB first discards all NULL values and then selects an arbitrary but consistent value from the remaining non-NULL values. If the attribute has no non-NULL values, then NULL is returned.
    For a multi-assign attribute, ARB looks at all of the rows in the group (including those with empty sets) and selects the set value from one of the rows. In other words, empty sets and non-empty sets are treated equally. This means that because the selection is arbitrary, the returned set value could be an empty set. The ARB return type is the same as its argument type: if attribute x is an mdex:long-set, then so is ARB(x). If the attribute has no non-NULL values, then the empty set is returned.
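    To make the quoted single-assign rule concrete, here is a toy model of it in Python (not Endeca code; just the documented behavior restated):

        def arb(values):
            # Discard NULLs, then pick an arbitrary but consistent value
            # from what remains; return NULL if nothing remains.
            non_null = [v for v in values if v is not None]
            return non_null[0] if non_null else None  # "first" is one consistent choice

        print(arb([None, "2013-11-27", "2013-11-28"]))  # 2013-11-27
        print(arb([None, None]))                        # None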

  • Writing 2-D array to spreadsheet file changes value of time column when saved

    I have a program that creates a new row in a table each time the loop iterates. On each iteration, as a row is added to the table, the program uses "Write to Spreadsheet File.vi" to save this new table over the previous one. The array is also visible on the front panel of the VI. The program works in such a way that you can stop adding data to the array, start adding data to a new array, then come back to the original using "Read from Spreadsheet File.vi". My problem comes after opening the spreadsheet file. The first column of the array, which corresponds to the time when the data in that row was taken, has different values than those seen when the array was originally displayed on the front panel. In fact, all of the rows created the last time the program was run with that array have the same value in the time column, while all of the other columns keep their correct values. Here is an example of what is happening...
    When I start the program, and run it for 5 iterations, I get a table like this displayed on the front panel:
    3241713920   85.88012   93.88012   1.735015...
    3241713930   85.97603   93.97603   1.747003...
    3241713940   84.58855   92.58855   1.573568...
    3241713950   84.48096   92.48096   1.560119...
    3241713960   84.93667   92.93667   1.617084...
    This array is saved to the spreadsheet file with each added row.  If I initialize and build another array, then come back to this first array using "read from spreadsheet file.vi," I will receive the following table:
    3241713920   85.88012   93.88012   1.735015...
    3241713920   85.97603   93.97603   1.747003...
    3241713920   84.58855   92.58855   1.573568...
    3241713920   84.48096   92.48096   1.560119...
    3241713920   84.93667   92.93667   1.617084...
    If anyone could tell me why this is happening and/or how to fix it, that would be greatly appreciated. One option I have contemplated is appending each individual row to the file instead of resaving the entire array on each iteration. However, I would prefer to keep my current approach, so I can pass the array through each time and perform different tasks on it, such as averaging the values in columns over different time periods. Thank you in advance to anyone who can help.

    DWDerr wrote:
    If anyone could tell me why this is happening and/or how to fix it, that would be greatly appreciated.  ...
    This KnowledgeBase article may help.  I think things changed in LV8.20, but I could be wrong.
    =====================================================
    Fading out. " ... J. Arthur Rank on gong."
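    One classic way timestamps like these collapse is a pass through single-precision (SGL) floating point somewhere in the save/load path. Whether or not that is the cause here, the arithmetic is easy to check in Python:

        import numpy as np

        times = [3241713920, 3241713930, 3241713940, 3241713950, 3241713960]
        print([float(np.float32(t)) for t in times])
        # All five collapse to 3241713856.0: float32 spacing near 3.2e9 is 256,
        # so 10-second steps are unrepresentable.

    Double precision (LabVIEW's DBL) has no such problem at this magnitude, so it is worth checking every numeric representation between the front panel and the file.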

  • Filter a list-views on page by Date and Time column, not only by Date (Connected Webpart)

    I have a task: create a page that displays all the lists/libraries holding documents for team meetings. Every meeting has documents and other information, stored in the same lists/libraries on the website as the other meetings. Additionally, there is a field "MeetingDate" (type: DateTime) which helps to identify a meeting's documents.
    I have put everything on one page along with my custom connected web part (IWebPartRow interface), which helps to filter all the views on the page by the meeting date field. It sends a date to the other web parts on the page, and the user sees information only for that date.
    It works fine, but the problem is that it uses only the DATE (without time), and I now need to filter everything by date and time.
    Unfortunately, I noticed that I cannot filter items by date with time in a standard list either. Please see my screenshot.
    My question is: maybe I should use another interface (not IWebPartRow) to send two values (1 - date, 2 - time) and filter everything by the two columns. Is that possible?
    Unfortunately, I didn't find anything like this. I found only a solution that builds a URL like 'http://sp2013/SitePages/Homepage.aspx#InplviewHasha2528afc-50c3-4d5e-aae9-682400976409=FilterField1%3DDatum-FilterValue1%3D2015%252D02%252D27-FilterField2%3DCustomTime-FilterValue2%3D11%253A00'
    , but it does not look so good :(
    from MSDN forum...
    from MSDN forum...

    Hi,
    Per my understanding, you want to display all the specific documents on a page, filtered by the values of a Date and Time column.
    A possible solution: create a custom visual web part to display all these documents. In the code-behind, run a CAML query to retrieve all the documents with the specific date and time, then generate a view of hyperlinks that point to these documents.
    In this way, we combine the filter component and the data display component into one web part. With the help of a CAML query and the object model, it would not be a tough job to implement such a solution.
    More information about CAML and Object Model:
    https://msdn.microsoft.com/en-us/library/office/ee536691%28v=office.14%29.aspx?f=255&MSPPError=-2147217396
    Thanks
    Patrick Liang
    TechNet Community Support
    Please remember to mark the replies as answers if they help, and unmark the answers if they provide no help. If you have feedback for TechNet Support, contact
    [email protected]

  • Start time/end time columns in the weekly overview of CATS regular

    Hello,
    Does anyone know if it is possible to have the start and end time columns alongside the duration column in CATS regular (iView "record working times" in ESS) in the weekly view? Those columns are available in the daily view, but we want them in the weekly view as well. This is possible on the backend, but the question remains for ESS.
    Thanks a lot for your help
    KR
    Gaëlle

    Dear Gaëlle,
    My client has this requirement as well. Have you found a way to configure it?
    Please let me know.
    Thanks in advance
    Regards,
    Bryan

  • Is "A-Time" column in dbms_xplan.display_cursor summarized ?

    Hello,
    I did some research on the internet before posting this, but I could not find sufficient information.
    Is the "A-Time" column in dbms_xplan.display_cursor summarized?
    | Id  | Operation                                                    | Name                           | Starts | E-Rows | A-Rows |   A-Time   | Buffers | Reads  |  OMem |  1Mem | Used-Mem |
    |   0 | SELECT STATEMENT                                             |                                |      1 |        |    155 |00:06:36.28 |    4957K|  34952 |       |       |          |
    |   1 |  SORT ORDER BY                                               |                                |      1 |      1 |    155 |00:06:36.28 |    4957K|  34952 | 55296 | 55296 |49152  (0)|
    |   2 |   NESTED LOOPS                                               |                                |      1 |      1 |    155 |00:06:30.04 |    4957K|  34952 |       |       |          |
    |   3 |    NESTED LOOPS                                              |                                |      1 |      1 |    155 |00:06:30.04 |    4957K|  34952 |       |       |          |
    |   4 |     NESTED LOOPS                                             |                                |      1 |      1 |    155 |00:06:30.04 |    4957K|  34952 |       |       |          |
    |   5 |      NESTED LOOPS                                            |                                |      1 |      1 |    155 |00:06:30.04 |    4956K|  34952 |       |       |          |
    |   6 |       NESTED LOOPS                                           |                                |      1 |      1 |    155 |00:06:30.04 |    4956K|  34952 |       |       |          |
    |   7 |        NESTED LOOPS                                          |                                |      1 |      1 |    155 |00:06:30.03 |    4956K|  34952 |       |       |          |
    |   8 |         NESTED LOOPS                                         |                                |      1 |      1 |    155 |00:06:30.03 |    4956K|  34952 |       |       |          |
    |   9 |          NESTED LOOPS                                        |                                |      1 |      1 |    155 |00:06:30.03 |    4956K|  34952 |       |       |          |
    |  10 |           NESTED LOOPS                                       |                                |      1 |      1 |    155 |00:06:30.03 |    4955K|  34952 |       |       |          |
    |  11 |            NESTED LOOPS                                      |                                |      1 |      1 |    155 |00:06:30.03 |    4955K|  34952 |       |       |          |
    |  12 |             NESTED LOOPS                                     |                                |      1 |      1 |    155 |00:06:30.03 |    4955K|  34952 |       |       |          |
    |  13 |              NESTED LOOPS                                    |                                |      1 |      1 |    155 |00:06:30.03 |    4954K|  34952 |       |       |          |
    |* 14 |               HASH JOIN                                      |                                |      1 |      1 |    155 |00:06:30.03 |    4954K|  34952 |   872K|   872K|  927K (0)|
    |  15 |                VIEW                                          |                                |      1 |     15 |      8 |00:06:28.63 |    1305K|  34883 |       |       |          |
    |* 16 |                 FILTER                                       |                                |      1 |        |      8 |00:06:28.63 |    1305K|  34883 |       |       |          |
    |  17 |                  HASH GROUP BY                               |                                |      1 |     15 |      8 |00:06:28.63 |    1305K|  34883 |   760K|   760K| 1077K (0)|
    |  18 |                   VIEW                                       |                                |      1 |     15 |    341 |00:00:50.44 |    1305K|  34883 |       |       |          |
    |  19 |                    UNION-ALL                                 |                                |      1 |        |    341 |00:00:50.44 |    1305K|  34883 |       |       |          |
    |  20 |                     VIEW                                     | V_POSNR_2011000           |      1 |      7 |    303 |00:00:50.44 |     645K|  31282 |       |       |          |
    |  21 |                      UNION-ALL                               |                                |      1 |        |    303 |00:00:50.44 |     645K|  31282 |       |       |          |
    |  22 |                       VIEW                                   | V_POSNR_0200011           |      1 |      2 |     20 |00:00:50.42 |     429K|  31244 |       |       |          |
    |  23 |                        UNION-ALL                             |                                |      1 |        |     20 |00:00:50.42 |     429K|  31244 |       |       |          |
    |  24 |                         NESTED LOOPS                         |                                |      1 |      1 |     20 |00:00:50.42 |     376K|  28979 |       |       |          |
    |* 25 |                          HASH JOIN                           |                                |      1 |      1 |     20 |00:00:50.42 |     376K|  28979 |  1096K|  1096K| 1348K (0)|
    |* 26 |                           TABLE ACCESS BY INDEX ROWID        | PROPERTIES                     |      1 |      6 |   2651 |00:00:00.02 |    2131 |      0 |       |       |          |
    |* 27 |                            INDEX RANGE SCAN                  | P_SETAALDATE_IDX               |      1 |      6 |   2651 |00:00:00.01 |      21 |      0 |       |       |          |
    |  28 |                           VIEW                               | VW_JF_SET$7992605D             |      1 |      2 |    504 |00:02:30.85 |     374K|  28979 |       |       |          |
    |  29 |                            UNION-ALL                         |                                |      1 |        |    504 |00:02:30.85 |     374K|  28979 |       |       |          |
    (I hope this execution plan is reasonably readable.)
    I've been thinking the A-Time column shows the time of the particular operation (summarizing all time from child operations),
    but this seems different:
    |* 25 |                          HASH JOIN                           |                                |      1 |      1 |     20 |00:00:50.42 |     376K|  28979 |  1096K|  1096K| 1348K (0)|
    |* 26 |                           TABLE ACCESS BY INDEX ROWID        | PROPERTIES                     |      1 |      6 |   2651 |00:00:00.02 |    2131 |      0 |       |       |          |
    |* 27 |                            INDEX RANGE SCAN                  | P_SETAALDATE_IDX               |      1 |      6 |   2651 |00:00:00.01 |      21 |      0 |       |       |          |
    |  28 |                           VIEW                               | VW_JF_SET$7992605D             |      1 |      2 |    504 |00:02:30.85 |     374K|  28979 |       |       |          |
    Line 25 is a HASH JOIN comprising (1) a TABLE ACCESS BY INDEX ROWID and (2) the result from a VIEW.
    The timing of the HASH JOIN (line 25) is 00:00:50.42, but the timing for the VIEW is 00:02:30.85, and both children would have to complete before hashing can occur.
    So I was thinking the HASH JOIN would show at least the 00:02:30.85 from the VIEW plus the 00:00:00.02 from the TABLE ACCESS BY INDEX ROWID.
    But it seems this is a misconception; can someone shed some light on this?
    best regards
    Edited by: x45r32 on Apr 4, 2012 8:25 AM
    Edited by: x45r32 on Apr 4, 2012 8:46 AM

    x45r32 wrote:
    (First, please wrap the explain plan in code tags, one at the start and one at the end.)
    > I've been thinking the A-Time column shows the time of the particular operation (summarizing all time from child operations)
    Yes, it is related to the time an operation takes.
    > but this seems different:
    > |* 25 | HASH JOIN | | 1 | 1 | 20 |00:00:50.42 | 376K| 28979 | 1096K| 1096K| 1348K (0)|
    > |* 26 | TABLE ACCESS BY INDEX ROWID | PROPERTIES | 1 | 6 | 2651 |00:00:00.02 | 2131 | 0 | | | |
    > |* 27 | INDEX RANGE SCAN | P_SETAALDATE_IDX | 1 | 6 | 2651 |00:00:00.01 | 21 | 0 | | | |
    > | 28 | VIEW | VW_JF_SET$7992605D | 1 | 2 | 504 |00:02:30.85 | 374K| 28979 | | | |
    > Line 25 is a HASH JOIN comprising (1) a TABLE ACCESS BY INDEX ROWID and (2) the result from a VIEW.
    > The timing of the HASH JOIN (line 25) is 00:00:50.42 but the timing for the VIEW is 00:02:30.85, and both would have to complete before hashing can occur.
    > So I was thinking the HASH JOIN would at least take the 00:02:30.85 from the VIEW plus the 00:00:00.02 from the TABLE ACCESS BY INDEX ROWID.
    Why do you think that? The HASH JOIN is fed by the result sets of the VIEW and the index range scan; hashing those outputs is a separate operation, and there is no reason it should take the same 00:02:30.85 - the join is its own step performed on the result sets.
    More importantly, I can see a difference between A-Rows and E-Rows near the start of the explain plan. The INDEX RANGE SCAN on P_SETAALDATE_IDX was estimated to return 6 rows but actually returned 2651 at run time. Because of this mismatch early in the plan, the optimizer's estimates drift and it produces wrong estimations further up. Consider fixing this first and then move on.
    Please check whether the optimizer stats are up to date, especially on index P_SETAALDATE_IDX.
    I hope this helps.

  • How to delete the date and time column from a waveform spreadsheet file

    After acquiring a signal through the DAQ I am writing it to a spreadsheet, but for further post-processing of the data the date and time columns are unnecessary. Is there some way to format the file in LabVIEW so that those columns are removed? The DAQmx Read gives a 1-D array as output, so I could not find a way to select only the Y column.

    If you are getting a 1-D array from the DAQmx Read, then you don't have any date/time information to write, and if you are using Write to Spreadsheet File, there should be no way the date/time information gets into the file. Whatever you have programmed to put the date/time into the file, don't do it.
    p.s. You may not think you need timing information, but for a lot of post-processing it is essential to have. If it's in the file, you can always ignore it when you read it, and if you later find out it's necessary, it will already be there.
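    That said, if a file does end up with date/time columns you don't want, dropping them in post-processing is trivial. A small sketch in Python (the file name and column layout are assumptions):

        import numpy as np

        data = np.loadtxt("readings.txt")  # assumed layout: time in column 0, readings after
        y_only = data[:, 1:]               # drop the first (time) column, keep the rest
        np.savetxt("readings_y_only.txt", y_only, fmt="%.6f")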

  • Adding a time stamp

    How do I add a time stamp to video clips using the date and time the video was taken?

    GreenRoom
    What version of Premiere Elements are you using, and on what computer operating system is it running?
    Premiere Elements (any version) does not have a built-in feature for "adding a time stamp".
    What you can do is use the DVDate Plug-in for Premiere Elements for that task. Please see
    http://dvdate.ru/
    Please read details of the DVDate Plug-in in Item 0014 of my Premiere Elements 12 Daily Discovery blog post
    ATR Premiere Elements Troubleshooting: Premiere Elements 12 Daily Discoveries
    We will be looking forward to news of the outcome.
    Thank you.
    ATR

  • Type of object from a database date/time column

    Hello everyone,
    I've got a concern regarding the way in which coldfusion is
    treating date/time columns in a database.
    In my opinion when I select a column that is let's say
    timestamp type - the object in a query is CF's date/time object.
    I've checked the query by getMetaData() and it looks like each
    column has its own database type (eq. varchar2, date, etc).
    My concern comes from a fact that I've heard that Coldfusion
    is treating the date-time columns in a database as strings and in
    each case it parses them. So the difference in a locale between a
    database server and a Coldfusion server may result in a wrong
    date/time object.
    Example:
    Locale of database EU: yyyy/mm/dd
    Locale of Coldfusion US: yyyy/dd/mm
    The date like 1st of February 2007 stored in a database would
    look then in Coldfusion like: 2nd of January 2007
    Could someone please confirm which version is valid?

    Johnny852 wrote:
    > In my opinion when I select a column that is let's say timestamp type - the object in a query is CF's date/time object. I've checked the query by getMetaData() and it looks like each column has its own database type (eq. varchar2, date, etc).
    If you already know CF is returning a datetime from a datetime in the db, what's the question?
    > My concern comes from a fact that I've heard that Coldfusion is treating the
    "Fact"? Unless the datetime is stored as a string, that's not a "fact".
    > Example:
    > Locale of database EU: yyyy/mm/dd
    > Locale of Coldfusion US: yyyy/dd/mm
    By the way, neither of your example "locales" is in fact a locale.
    > The date like 1st of February 2007 stored in a database would look then in Coldfusion like: 2nd of January 2007
    If you pass a numeric string representation of a date (2/1/2007, for instance) to CF, it will be interpreted via the CF server's "default" locale (i.e. en_US, month/day/year) unless you use one of the LS functions like LSParseDateTime() and specifically set the locale (like setLocale("th_TH")), or make the date non-ambiguous (2-feb-2007), or build the date yourself from user input using createDate() or createDateTime(). You just need to convert the user's string representation to a valid CF datetime object (or, I guess, force the database to follow the user's locale date mask).
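    The ambiguity being described is easy to demonstrate outside CF. In Python, the same string parses to two different dates depending on the assumed field order:

        from datetime import datetime

        s = "2/1/2007"
        print(datetime.strptime(s, "%m/%d/%Y").date())  # 2007-02-01 (US-style reading)
        print(datetime.strptime(s, "%d/%m/%Y").date())  # 2007-01-02 (EU-style reading)

    Which is exactly why a real timestamp column, handed over as a native datetime object rather than a string, sidesteps the locale problem entirely.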

  • iTunes added extra time to a track

    iTunes added extra time to a track after transferring the song from my computer. What do I do to transfer the song correctly and remove the extra several minutes of dead air that iTunes added to the end of the track?

    Well....
    First of all, let me confirm exactly what you are doing.
    I notice you say iTunes is not recognising the change. So when you play the song in iTunes, it does not stop at the new time you've entered, is this correct? Or
    is it playing okay in iTunes, but not on your iPod? If this is what's happening, you have Synced your iPod after the change, haven't you? (Any changes you make in iTunes while the iPod is connected to iTunes will almost certainly not be activated on the iPod until you do another Sync.)
    (If you manually manage the music on your iPod, you will need to remove the track from the iPod and then put it back on again. Putting it back on should cause the newly revised stop time to be on your iPod.)
    So if your stop time is shown (in iTunes) as 2:52, but the track is not stopping at that time - in iTunes - then I suggest you do a repair of iTunes. To repair iTunes, download iTunes and go through the install procedure. (Do not delete or remove the existing iTunes.) Once the install procedure sees the existing iTunes, you should see an option to either repair or start new; choose the repair option.
    I've had to use the repair iTunes option myself, a few weeks ago, and it fixed the issue I was having. Can't remember what the issue was though!
    Finally, if the track plays okay in iTunes but not on the iPod after a new Sync, the only thing I can suggest is a Restore of your iPod. This will remove everything from the iPod, and the next Sync will put back whatever it finds in your library. If you manually manage, you will need to put everything back yourself.

  • Time column of an explain plan

    Hi,
    I'm using Oracle version 10.2.0.3.0. I have 2 tables with 10 million records each. The DDL is as follows.
    create table bigtable(col1 varchar2(20), col2 varchar2(20))
    create table bigtablechild(col1 varchar2(20), col2 varchar(20))
    bigtablechild.col1 is a foreign key to bigtable.col1. Below is the query and explain plan. Over several executions, the query runs for about 20 seconds before returning results. Could anyone please explain what the time column represents? It doesn't match the time it took to return results.
    SQL> set autotrace on
    SQL>
    SQL> select b.col2
      2  from bigtable a, bigtablechild b
      3  where a.col1 = b.col1
      4  and a.col1 = 'ABC6554';
    COL2
    XYZ6554
    XYZ6554
    XYZ6554
    XYZ6554
    XYZ6554
    Execution Plan
    Plan hash value: 4210396901
    | Id  | Operation          | Name          | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT   |               |     5 |   150 | 21538   (4)| 00:04:19 |
    |*  1 |  HASH JOIN         |               |     5 |   150 | 21538   (4)| 00:04:19 |
    |*  2 |   TABLE ACCESS FULL| BIGTABLE      |     1 |    10 | 13124   (4)| 00:02:38 |
    |*  3 |   TABLE ACCESS FULL| BIGTABLECHILD |     5 |   100 |  8413   (5)| 00:01:41 |
    Predicate Information (identified by operation id):
       1 - access("A"."COL1"="B"."COL1")
       2 - filter("A"."COL1"='ABC6554')
       3 - filter("B"."COL1"='ABC6554')
    Statistics
              0  recursive calls
              0  db block gets
          93672  consistent gets
          91845  physical reads
              0  redo size
            463  bytes sent via SQL*Net to client
            396  bytes received via SQL*Net from client
              2  SQL*Net roundtrips to/from client
              0  sorts (memory)
              0  sorts (disk)
              5  rows processed

    Hi,
    the values in the TIME column are calculated from the cost using system I/O statistics. If dbms_stats.gather_system_stats has never been run, these stats have default values which may be very far from the truth. In your case, the optimizer expects a single-block I/O read to take about 12 ms, while in reality it is closer to 1 ms, hence the discrepancy between the prediction and the actual results.
    In general, the TIME column is not very helpful, not just because of potentially incorrect I/O time estimates, but also because it is hard to predict how much of the data will be found in the cache, so I would recommend not paying too much attention to it. (Note, however, that the A-Time column is extremely useful; it is only available if rowsource statistics for the plan have been populated, e.g. by running the query with the gather_plan_statistics hint.)
    Best regards,
    Nikolay
