Quirk with storage VIs and timestamp data.

This is a minor bug (it's easy to work around once you're aware of it).
I'm running LV 8.0.1, and writing a timestamp through the Write Data Storage VI stores different times depending on whether I pass in just the timestamp or the timestamp wrapped in an array. The timestamp wrapped in an array is stored accurately, but a single timestamp comes out 10 hours behind. Considering that I'm in the PST (GMT-8:00) time zone, I'm not sure why it's 10 hours off rather than 8.
I attached a VI that demonstrates what is happening. The workaround for now is just to wrap timestamps in arrays before sending them into the Write Data Storage VI.
Attachments:
Storage VI Bug 2.vi (68 KB)

Hi,
Thank you for the feedback. I have filed a bug report for this behavior with our development team.
Thanks,
Ankita Agarwal, National Instruments

Similar Messages

  • PO report with cost center, order and delivery date

    Hi,
    Please, I just need a report that displays the following fields: Account Assignment and Delivery Date.
    I tried standard reports like ME2N and ME2K, but it is impossible to display all the information on the same screen.
    Thanks a lot for your cooperation.
    Regards.

    Hi,
    Create an InfoSet query or ask an ABAPer.
    Tables: EKKN, EKPO, EKKO (see the sketch below).
    Regards,
    Pardeep Mlaik
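
    As a rough sketch of what that InfoSet/ABAP report would pull together, here is a generic SQL join over those tables; the field names (EBELN, EBELP, KOSTL, AUFNR) and the use of the schedule-line table EKET for the delivery date are assumptions based on the standard purchasing data model, so verify them in your system:
    -- Hypothetical sketch: EKKO = PO header, EKPO = PO item,
    -- EKKN = account assignment, EKET = schedule line (assumed to hold EINDT).
    SELECT h.ebeln AS po_number,
           i.ebelp AS po_item,
           a.kostl AS cost_center,
           a.aufnr AS internal_order,
           s.eindt AS delivery_date
    FROM   ekko h
           JOIN ekpo i ON i.ebeln = h.ebeln
           LEFT JOIN ekkn a ON a.ebeln = i.ebeln AND a.ebelp = i.ebelp
           LEFT JOIN eket s ON s.ebeln = i.ebeln AND s.ebelp = i.ebelp;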

  • Is it possible to combine an address list with a Pages document and a database in Numbers, the equivalent of "Mail Merge" in Microsoft Office?

    Is it possible to combine an address list with a Pages 5.1 document and a database in Numbers 3.1, the equivalent of Microsoft "Mail Merge"?

    It is possible in Pages '09 and Numbers '09, but not in the versions you name.

  • Please help with restoring different apps and their data between 2 iPads

    I just bought a new iPad Air 2. I currently have 2 iPads, a 1st gen and an iPad 2, each with its own set of apps and data. I backed up each older iPad manually as a full backup, and I created a separate library for one of them when doing this. I wanted to transfer the iPad 1st gen apps and data to the iPad 2, so I reset the iPad 2 to factory settings and then restored from the manual backup of the iPad 1st gen, and that seems to have worked fine. Before that, I had backed up the iPad 2's current data. Now I want to transfer the iPad 2 apps and data to the new iPad Air. I selected the backup by the timestamp of when I had backed up the iPad 2 (I didn't know how to rename each iPad to something unique). It restored from backup, but it's all the apps and data from my iPad 1st gen backup, not the iPad 2.
    I'm utterly confused. Even when I have the separate iTunes library open, it shows all the manual backups I performed, so it's not truly a separate library. Do I need to manually install/remove the various apps on the newer iPad?

    If you only have 1 file per quarter, one obvious thing to do would be to edit out the time in between plays, the time-outs, and the like. That is easily done in iMovie.
    Having said that, importing (and exporting) in high definition takes time, and your times do not seem unreasonable.

  • I have an issue with my 3G connection: my iPhone can't connect to the data network anymore, even though I have a contract with unlimited internet access and data/3G is turned ON. Anyone know this problem?

    Hi guys,
    This problem showed up a few days ago. Even though I have turned on 3G and cellular data, my phone tells me there is actually no connection, yet the network indicator shows full signal. I went to the Genius Bar and they fixed it by restoring the phone, and it started working again. Then I restored it at home with my backup. Today it started again: network issue. And the problem does not come from the network (I have tried my SIM in another iPhone with the same carrier). It really does come from the device.
    Any ideas ?
    Many thanks

    Hello,
    According to iTunes: About iOS backups (http://support.apple.com/kb/HT4946),
    yes, it saves:
    "App Store Application data including in-app purchases (except the Application itself, its tmp and Caches folder).
    Application settings, preferences, and data, including documents."
    Cheers,
    Lima

  • Need a little help with Slimbox (Lightbox clone) and Spry data sets

    Hello guys!
    First of all, let me say that I'm not a programmer in any way, shape or form, and yet somehow I managed to build myself a dynamic thumbnail gallery that reads data from an XML file and displays it on my webpage using a Spry data set and a slider widget (yay!). This is of course only thanks to the many great examples provided by the Adobe Spry team, and me being stubborn enough to keep at it, even though I don't really understand what I'm doing :D
    I got to the point where I have basically everything working, except that I can't get the Slimbox (Lightbox clone) script to work with the Spry-generated thumbnail gallery.
    From what I could understand from other threads on this forum, I need to add an observer somewhere, only I'm not sure where and how (those threads are pretty old and the examples aren't available anymore).
    I'm sure you guys know what I'm talking about. Anyway, here's what I've got so far:
    http://www.riotdesign.com.ar/misc/gallery/test1.html
    I have the thumbnail gallery populated from the external XML file, basic page navigation using the Sliding Panels widget, and the Slimbox script, which works only on the static test image.
    Okay, I guess that's it for now. Sorry for the long post, and of course any help with this will be GREATLY appreciated :)
    Thanks & bye!

    Kev,
    Where exactly does the .evalScripts = true; text need to go?
    Does it go in the href call?
    <a href="ManageNotes.asp" title="Manage Notes" onClick="this.blur();
    Modalbox.show(this.href, {title: 'Manage Notes', width: 575}); return false;">View your notes.</a>
    Thanks for any assistance.
    J Bishop

  • Help with MIRO BAdIs and Translation Date (Exchange Rate Date Reference)

    Dear experts
    This is a problem I have read a lot about, but none of the answers works properly, so I wanted to create a new thread to try to compile a final answer for this problem (at least in version 6.0).
    As you know, standard SAP uses the posting date as the translation date in MIRO for foreign currency.
    This is not always right (imagine crude imports with 3 dates: invoice date, posting date, and Bill of Lading date. The last one happens to be the fiscal date for the exchange rate in my country, and the other two are also mandatory).
    I am proposing, therefore, to use 3 dates: the document date as the invoice date, the posting date as the posting date, and the Invoice Receipt Date
    (REINDAT - RBKP) as the Bill of Lading date. I would like to implement this third date as the translation date.
    There are lots of ways to do it, but none works properly, because it is complicated for the end user to enter data in exactly the way that makes the BAdIs work. I have implemented note 574583, and it only works with some restrictions.
    I have also used some more BAdIs like MRM_HEADER_CHECK, FI_TRANS_DATE_DERIVE, INVOICE_UPDATE, ... and all of them have some restrictions (depending on the order in which data is entered in the header tabs, or saving the exchange rate properly in the MM tables but not in the FI tables).
    I would really appreciate it if anyone could help with this, ensuring that the BAdI always gets the data and works with no screen-selection dependence.
    Thanks in advance.

    Dear All,
    I have found the solution with the help of an ABAPer. The system has a bug for which SAP has issued SAP Note 22781.
    What actually happened in the invoice is that the system by default fetched the exchange rate last maintained for exchange rate type 'M'. Since we were using a different exchange rate, it did not match. We also only came to know about the difference because, of late, the users had stopped updating exchange rate 'M'.
    The funny part is that in the invoice, if we click on each tab at the header and come back to the first tab, where we find the exchange rate for accounting, the system picks up the actual rate as per the exchange rate maintained.
    Regards,
    Karthik.
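
    For intuition, the default lookup described above behaves roughly like "take the most recently maintained rate of exchange rate type 'M'". Here is a hypothetical SQL sketch against the standard rate table TCURR; the currency pair is a placeholder, and note that GDATU is stored date-inverted in real systems, so the latest date is the minimum value:
    SELECT ukurs                      -- the exchange rate
    FROM   tcurr
    WHERE  kurst = 'M'                -- exchange rate type 'M'
      AND  fcurr = 'USD'              -- from-currency (placeholder)
      AND  tcurr = 'EUR'              -- to-currency (placeholder)
      AND  gdatu = (SELECT MIN(gdatu) -- date-inverted: MIN = most recent
                    FROM   tcurr
                    WHERE  kurst = 'M'
                      AND  fcurr = 'USD'
                      AND  tcurr = 'EUR');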

  • Problem with PCo 2.1 and ODBC data provider

    I am trying to connect MII 12.0 to an Aspen IP.21 data historian using PCo as the universal data server. Configuration steps so far:
    1) On the machine where PCo is installed, I have created an ODBC data source using Aspen's ODBC driver for SQL+.
    2) In the PCo management console I have configured a source system using the Microsoft OLEDB Provider for ODBC Drivers, and selected the ODBC connection that I created. I click "Test Connection" and get a successful result.
    3) In the PCo management console I have configured a destination system that points to an MII server. I put in the appropriate server/port/version/credentials, and click "Test Connection", and get a successful result.
    4)  In the PCo management console I have configured an agent instance connecting the source from step 2 and the destination from step 3.
    When I start the agent instance, I get the red box with a white X in it indicating that the connection was not successful. When I go to the error logs I see:
    Error     .     6     6076     RS1630IP21T01     Host     failed to create ConnectivityAgentProxy     General PCo Fault: The .Net Framework Data Provider for OLEDB (System.Data.OleDb) does not support the Microsoft OLE DB Provider for ODBC Drivers (MSDASQL). Use the .Net Framework Data Provider for ODBC (System.Data.Odbc)
    So, I installed the .NET data provider for ODBC to try to use that instead of the OLEDB provider for ODBC. However, when I try to reconfigure the source system, I do not see the .NET provider as one of my options. This is after a reboot, and starting/stopping all of the different PCo services.
    Any thoughts, experts?

    Install PCo 2.2 on a server that has network access to your IP.21 server, and that your destination server (an SAP MII server in my case) has access to.
    In the PCo2.2 Management Console:
    In the "Source Systems" section:
    Create a source system of type "IP21 Agent"
    On the Server Settings tab, provide the server name of your IP.21 server
    In the "Agent Instances" section
    Create an agent instance, picking the source system you just created
    In the Query Ports tab, under port type, select the type of system you will be communicating to. In my case, this is SAP MII, but you may be using something different.
    Also under the Query Ports tab, you can enter a port number if you like. I just accept the default, which is 9000.
    If you are going to "pull" data from the IP.21 server, this is all you need. If you want to "push" data to a destination system from your PCo server, you will need to set it up under the Destination Systems section. In my case, I am doing a data pull, so I haven't done much with destination systems.
    Not sure what your destination system is, but in my case it is MII. For MII here are the basic steps:
    Create a new UDC data server in the MII menu under Data Services -> Data Servers.
    Set the IP address of the data server to the IP address of your PCo server.
    Set the port number of the data server to the port number you set up in the agent instance.
    Make sure the agent instance is started on your PCo server.
    Create a new business logic transaction containing a Tag query, and configure the tag query to use the new MII data server you just created to query the tag(s) you are interested in.

  • Problems with Importing / Exporting Keywords and Metadata

    Hi to all!
    I recently upgraded to Aperture 3 and upgraded my referenced library.
    Today I opened the Keyword HUD and noticed some keywords scattered in my list, which seem to be older ones, since the new numbering indicated that they are not applied to any pictures.
    So I deleted them.
    Then I noticed the <Imported Keywords> folder, opened it, and it also contained a large number of previous keywords. They also seemed not to be in use, so I removed them as well.
    Then I locked the Keyword HUD.
    Now my question: if I export a version with the 'include metadata' option ticked, edit the version in Photoshop, and afterwards import it back into the Aperture library, I have the problem.
    I have tried 'Import Metadata' and clicked 'Append'. It then recognizes the former keywords, which I would appreciate, but not as 'Imported Keywords'.
    If I opened the file with the External Editor and returned it back, I guess I would not have this problem. But usually I open the referenced RAW file and import the edited version back into Aperture. Keywords are stored in the library, so I would not get the formerly assigned keywords then, is that right?
    By the way, are the keywords part of the metadata or not?
    Are there any workarounds?
    And have other people also had problems with their keyword list after upgrading?
    Thanks for any ideas/info!
    Michael

    Added a header to the CSV and to the code:
    $ImportFile = Import-Csv "C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Tags.csv" -Header Computer
    foreach ($Computer in $ImportFile) {
        $path = "\\$Computer\c$\Epic\bin\7.9.2\Epic Print Service"
        $xml = Select-Xml -Path "$path\EpicPullService.config.xml" -XPath "//EpicPullService//Cleanup" | Select-Object -ExpandProperty Node
        if ($xml.ArchiveHours -eq '12' -and $xml.DeleteHours -eq '120') {
            $Compliance = $True
        } else {
            $Compliance = $False
        }
        "$Computer","$Compliance" | Export-Csv "C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Results.csv"
    }
    Results:
    select-xml : Cannot find path '\\@{Computer=SW1412-16985}\c$\Epic\bin\7.9.2\Epic Print Service\EpicPullService.config.xml' because it does not exist.
    At C:\Users\username\Desktop\Scripts\Powershell\Epic\SCCM CI\Check_PullServiceXML.ps1:4 char:8
    + $xml = select-xml -path "$path\EpicPullService.config.xml" -xpath //EpicPullServ ...
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
        + CategoryInfo          : ObjectNotFound: (\\@{Computer=SW...vice.config.xml:String) [Select-Xml], ItemNotFoundException
        + FullyQualifiedErrorId : PathNotFound,Microsoft.PowerShell.Commands.SelectXmlCommand
    If it is not a CSV file, then just get it with Get-Content:
    Get-Content "C:\Users\UserName\Desktop\Scripts\Powershell\Epic\SCCM CI\Tags.csv" |
        ForEach-Object {
            $computer = $_
            # $computer is a plain string here, so the UNC path expands correctly
            $path = "\\$computer\c$\Epic\bin\7.9.2\Epic Print Service\EpicPullService.config.xml"
            # ... rest of the compliance check as above ...
        }
    ¯\_(ツ)_/¯

  • Problems with OC4J EJB deployment and the data-sources.xml file

    I am running 2 Windows 2000 machines, one with the 8.1.7 database and another with iAS 1.0.2.2.1 and OC4J.
    I am trying to deploy a 3rd-party EJB-based application, which seems to have deployed successfully, except that when I try to test the EJB deployment via a JSP it can't connect to the database, giving the error:
    1/16/02 4:52 PM VerySimple: Servlet error
    java.lang.NoClassDefFoundError: com.netexp.user.UserManagerHome
    at com.netexp.beans.BeanHelper.class$(Unknown Source)
    at com.netexp.beans.BeanHelper.getUserManagerBean(Unknown Source)
    at /very_simple.jsp._jspService(/very_simple.jsp.java:48) (JSP page line 27)
    at com.orionserver[Oracle9iAS (1.0.2.2.1) Containers for J2EE].http.OrionHttpJspPage.service(OrionHttpJspPage.java:54)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.HttpApplication.serviceJSP(HttpApplication.java:5459)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.JSPServlet.service(JSPServlet.java:31)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.ServletRequestDispatcher.invoke(ServletRequestDispatcher.java:508)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.ServletRequestDispatcher.forwardInternal(ServletRequestDispatcher.java:177)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.HttpRequestHandler.processRequest(HttpRequestHandler.java:576)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].server.http.HttpRequestHandler.run(HttpRequestHandler.java:189)
    at com.evermind[Oracle9iAS (1.0.2.2.1) Containers for J2EE].util.ThreadPoolThread.run(ThreadPoolThread.java:62)
    I tried to update the file using the -installDataSource switch, thus:
    C:\Oracle\iSuites\j2ee\home>java -jar admin.jar ormi://localhost admin adm_pwd -application apptricityII -installDataSource -jar %ORACLE_HOME%\jdbc\lib\classes12.zip -url jdbc:oracle:thin:@db_host.unitas.com:1521:db_name -connectionDriver oracle.jdbc.driver.OracleDriver -location jdbc/pool/OracleDataSource -username scott -password tiger
    And get the following error:
    Exception passing by from remote server: java.lang.InstantiationException: No class specified for jdbc/pool/OracleDataSource
    java.lang.InstantiationException: No class specified for jdbc/pool/OracleDataSource
    <<no stack trace available>>
    Error adding source: No class specified for jdbc/pool/OracleDataSource
    Please advise.
    I have followed the instructions in 'EJB Primer' and 'Using Oracle9iAS Containers for J2EE' to no avail, and I can't find any meaningful data-sources.xml samples. I have been struggling with this for some time.
    Is there any other file? Do I need to bind anything?
    I'd appreciate your assistance.
    Thank you
    Louiza

    Hi Louiza,
    Show us your web.xml and ejb-jar.xml files, as well as your
    data-sources.xml file.
    Thanks,
    Avi.

  • Aggregates with Attribute Change Run and Master data

    Hi!
    I created an aggregate which includes master data such as Customer, Customer Number, Material, Material Number...
    It also contains a navigational attribute and a hierarchy...
    Now, we all know that if master data and hierarchies change frequently (or some new values are added, or new attributes are added, etc.), we want to see the respective change in the aggregates too.
    So just rolling up the aggregate will not help.
    We need to apply an attribute change run to the aggregates for this purpose.
    Now the question is: how do I apply an attribute change run to aggregates?
    How do I automate this process with process chains?
    If I create an aggregate on the master data CUSTOMER NO., is that aggregate automatically included in the attribute change run for that customer number? Yes or no?
    (What I mean to say is: if there is an attribute change run for Customer No., and it is in a process chain specially created for Customer Number, then, as the aggregate is created on Customer No., does it automatically apply the changes to the aggregate too, or do we have to create a special process chain for it?)
    Please reply ASAP; it's urgent.

    Hi,
    Check these links for the attribute change run:
    What's the attribute change run? and the common sequence of a process chain
    http://help.sap.com/saphelp_bw30b/helpdata/en/80/1a67ece07211d2acb80000e829fbfe/content.htm
    Regards,
    Harikrishna N

  • Problems with driver 10.020 and returning data piped together for reports

    Hi,
    We have recently upgraded our UAT database to Oracle 10g, and I have been testing the reports that we run against the database. I have come across a problem when concatenating ("||") columns to return a single column, e.g.:
    SELECT field1||','||field2||','||field3 FROM dual
    It seems that if a certain number of columns is returned, the data comes back with each character separated by a CHR(0). Does anyone know why this is happening? It doubles the size of reports, as there are twice as many characters!
    The only way I have managed to sort of fix it is to return smaller concatenated blocks. Is there a limit on concatenating fields or different datatypes together?
    Thanks
    PM
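
    A minimal sketch of the smaller-blocks workaround mentioned above: concatenate in chunks and cast each chunk explicitly (table and column names are placeholders):
    SELECT CAST(field1 || ',' || field2 AS VARCHAR2(2000)) || ',' ||
           CAST(field3 || ',' || field4 AS VARCHAR2(2000)) AS csv_row
    FROM   some_table;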

    Here is one of our report SQL statements. I have added a carriage return where I have added a comma to create a new column.
    select ' Date,Centre,Division,Time_Taken,',
    'Putaways,Case_Putaway,Replens,Case_Replens,Total Orders,R20_Pallets_Picked,R20_Case_Picked,',
    'R20_Total_Pallets,R30_Case_Picked,R30_Lines_Picked,R30_Pallets_Picked'
    from dual
    union
    select inv.dstamp||','||inv.centre||','||inv.colour||','||
    decode(length(to_char(floor(sum(elapsed_time)/60/60))),1,'0','')||floor(sum(elapsed_time)/60/60)||':'||
    decode(length(to_char(mod(floor(sum(elapsed_time)/60),60))),1,'0','')||mod(floor(sum(elapsed_time)/60),60)||':'||decode(length(to_char(mod(sum(elapsed_time),60))),1,'0','')||mod(sum(elapsed_time),60)||',',
    count(decode(inv.code,'Putaway',inv.tag_id))||','||
    round(nvl(sum(decode(inv.code,'Putaway',cases)),0))||','||
    count(decode(inv.code,'Replenish',inv.tag_id))||','||
    round(nvl(sum(decode(inv.code,'Replenish',cases)),0))||','||
    count(distinct decode(inv.code,'Pick',inv.reference_id))||','||
    count(decode(inv.code||substr(loc.work_zone,1,3),'PickR20',inv.tag_id))||',',
    round(nvl(sum(decode(inv.code||substr(loc.work_zone,1,3),'PickR20',cases)),0))||','||
    count(decode(substr(loc.work_zone,1,3),'R20',inv.tag_id))||','||
    round(nvl(sum(decode(inv.code||substr(loc.work_zone,1,3),'PickR30',cases)),0))||','||
    count(distinct decode(inv.code||substr(loc.work_zone,1,3),'PickR30',
    inv.reference_id||inv.line_id))||','||
    count(distinct decode(inv.code||substr(loc.work_zone,1,3),'PickR30',inv.pallet_id))
    from
    (select 'RETAIL' as centre, 'BABY' as colour
    from dual
    union select 'RETAIL' as centre, 'MEDICAL' as colour
    from dual
    union select 'HOMEWARD' as centre, 'BABY' as colour
    from dual
    union select 'HOMEWARD' as centre, 'MEDICAL' as colour
    from dual
    union select 'RETAIL' as centre, 'UNKNOWN' as colour
    from dual
    union select 'HOMEWARD' as centre, 'UNKNOWN' as colour
    from dual) cd,
    (select
    TO_CHAR(inv.dstamp, 'YYYY-MM-DD') AS dstamp,
    DECODE(OH.User_Def_Type_1, '20', 'HOMEWARD',
                   DECODE(OH.User_Def_Type_1, '22', 'HOMEWARD',
                   DECODE(OH.User_Def_Type_1, '23', 'HOMEWARD',
                   DECODE(OH.User_Def_Type_1, '24', 'HOMEWARD', 'RETAIL')))) AS Centre,
    NVL(s.colour, 'UNKNOWN') AS colour,
    inv.user_id, inv.elapsed_time, inv.code,
    inv.tag_id,
    decode(inv.code,'Replenish',inv.from_loc_id, 'Pick',inv.from_loc_id,
    'Putaway',inv.to_loc_id) as LOCATION_ID,
    decode(inv.update_qty,0,1,inv.update_qty)/pac.ratio_1_to_2 as cases,
    inv.reference_id, inv.line_id, inv.pallet_id
    from
    inventory_transaction inv, order_header oh, sku s, sku_config pac
    where
    inv.client_id = oh.client_id (+)
    and inv.reference_id = oh.order_id (+)
    and inv.site_id = oh.from_site_id (+)
    and inv.client_id = s.client_id
    and inv.sku_id = s.sku_id
    and inv.config_id = pac.config_id
    and inv.code in ('Replenish','Pick','Putaway')
    and inv.dstamp BETWEEN TO_TIMESTAMP(TO_CHAR(NEXT_DAY(SYSDATE,'MONDAY')-interval '14' day,'DD/MM/YYYY')||' 00:00:00','DD/MM/YYYY HH24:MI:SS.FF')
    AND TO_TIMESTAMP(TO_CHAR(NEXT_DAY(SYSDATE,'SUNDAY')-interval '7' day,'DD/MM/YYYY')||' 23:59:59','DD/MM/YYYY HH24:MI:SS.FF')
    and inv.client_id = 'NUTRICIA'
    and inv.site_id = 'WAR01') inv,
    location loc
    where cd.centre = inv.centre (+)
    and cd.colour = inv.colour (+)
    and inv.location_id = loc.location_id
    and loc.site_id = 'WAR01'
    and (loc.work_zone like 'R20%' or loc.work_zone like 'R30%')
    group by
    inv.dstamp, inv.centre, inv.colour

  • After updating iPod to iOS 5.0, cannot get into Game Center with previous nickname. It says my nickname is already in use by another account. Why did this happen, and how can I fix this with my old nickname and its data?

    After updating my iPod Touch to iOS 5.0, I cannot get into Game Center with my previous nickname. The screen says my nickname is already in use by another account. Why did this happen, and how can I fix this so I can still use the same nickname?

    Did you have any problems updating to iOS 5? It sounds like the update did not correctly restore from the backup that iTunes makes as the first step of the update.

  • Please Help: Trouble with nested CASE statement and comparing dates

    Please tell me why the query below always returns the NULL branch, even when the start_date of OLD is greater than or equal to the start_date of NEW.
    What I want to do is get the difference of the start_dates of two statuses (start_date of OLD - start_date of NEW) if:
    1. the end_date of NEW is not null, and
    2. the start_date of OLD is greater than the start_date of NEW;
    otherwise return null.
    select id,
           case when max(end_date) keep (dense_rank last
                      order by decode(request_wflow_status,'New',1,0), start_date) is null
                then null
                else case when max(decode(status,'OLD',start_date,null)) >
                               max(decode(status,'NEW',start_date,null))
                          then max(decode(status,'OLD',start_date,null)) -
                               max(decode(status,'NEW',start_date,null))
                          else null
                     end
           end result
    from cc_request_status
    where id = 1
    group by id;

    Avinash,
    Thank you for your help. Here is a more detailed description of my problem.
    Here is a sample of the data I have for a table with four columns (id, status, start_date, end_date).
    What I need to do is get the difference of the start dates of the maximum available dates, if the data is valid. The pseudocode is as follows:
    IF end_date of NEW status is null
    THEN return null
    ELSE
    IF start_date of OLD >= start_date of NEW
    THEN return (start_date of OLD - start_date of NEW)
    ELSE return null
    I used the following query, but it always returns null:
    select id,
           (case when max(end_date) keep (dense_rank last
                       order by decode(status,'new',1,0), start_date) is null
                 then null
                 else (case when max(decode(status,'old',start_date,null)) >=
                                 max(decode(status,'new',start_date,null))
                            then max(decode(status,'old',start_date,null)) -
                                 max(decode(status,'new',start_date,null))
                            else null
                       end)
            end) result
    from tbl
    where id = 1
    Based on the sample below, I expected to get the result 14-Mar-07 - 16-Feb-07, which is the difference of the maximum start_dates of the two statuses. However, the query is not working. Please help me. Thank you.
    Id    Status    Start_date    End_date
    1     new       03-Feb-07     07-Feb-07
    1     new       16-Feb-07     21-Feb-07
    1     old       10-Mar-07     12-Mar-07
    1     old       14-Mar-07     16-Mar-07
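
    For comparison, here is a sketch of the same check written with plain conditional aggregation against the sample table above. Note that the status literals must match the stored case ('new'/'old'), and that this variant takes the greatest end_date over the 'new' rows rather than the chronologically last row's end_date, so verify it matches the intent:
    SELECT id,
           CASE
             WHEN MAX(CASE WHEN status = 'new' THEN end_date END) IS NULL
               THEN NULL
             WHEN MAX(CASE WHEN status = 'old' THEN start_date END) >=
                  MAX(CASE WHEN status = 'new' THEN start_date END)
               THEN MAX(CASE WHEN status = 'old' THEN start_date END)
                  - MAX(CASE WHEN status = 'new' THEN start_date END)
           END AS result
    FROM   tbl
    WHERE  id = 1
    GROUP  BY id;
    -- With the sample rows: 14-Mar-07 - 16-Feb-07 = 26 (days)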

  • Help with plotting NMEA latitude and longitude data on a map

    Hello,
    I am working on developing a GPS application. I have succeeded in retrieving and interpreting the NMEA data into the required longitude and latitude information.
    Can anyone tell me how I can show this information on a map in a Windows application (waypoints/tracks)? What kind of maps would be suitable, and how do I interface my NMEA data with the maps?
    All information would be duly appreciated, no matter how trivial.
    thanks

    CROSS POSTED
    [http://forums.sun.com/thread.jspa?threadID=5324397]
    Cross posting is rude.
    db

Maybe you are looking for

  • How many people can I share with?

    Hi. Is there an upper limit on the number of people I can share docs with? You have stated 'as many people as you like' in your blurb; is this really the case? By the way, I am only thinking of 100 to 200, but wondered if one could go into the thousands. Thanks

  • How can I upload items in the product catalog?

    Hello, how can I upload items in the product catalog? (Other than running the concurrent program to import the items from the inventory module.)

  • Apple TV sound problem

    Does anyone have any suggestions on why I get sound and picture when streaming from the cloud, but picture only when using AirPlay from my iPad?

  • HT201209: How do I find out my account balance when I redeem gift cards?

    If I redeem gift cards, where can I see my account balance? Also, if I purchase apps, games, etc., how does it know to use the gift card vs. my credit card on file? Thanks!

  • Mailbox is on a different server

    Hi, I had successfully configured JES 2005 Q1 Messaging + Calendaring + Delegated Administration on top of Directory and Access Manager on Sol-10 x86. It was working perfectly fine until I had a necessity to configure the same LDAP as a Native Sola