Using OracleXMLSave for nested date

I am trying to insert an XML document containing the following date structure into Oracle using the OracleXMLSave class. Is there a way to do this without reformatting the date into a single element beforehand? Here is the structure.
<expiration-date>
<expiration-year>2000</expiration-year>
<expiration-month>12</expiration-month>
<expiration-day>31</expiration-day>
</expiration-date>
Any help appreciated!
Kurt

Hi, try this:
SQL> select to_char(sr.recievedat, 'FMDD Mon yyyy fmHH24:MI') from dual;
Example:
SQL> select to_char(to_date('01-jan-2009 7:01:06','dd-mon-rrrr HH:MI:SS'), 'FMDD Mon yyyy fmHH24:MI') from dual;
TO_CHAR(TO_DATE('01-JAN
-----------------------
1 Jan 2009 07:01
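
If the goal is to end up with a single DATE value built from the three nested elements in the question, here is a minimal sketch along the same TO_DATE lines (the literals stand in for the year/month/day element values from the example document and would normally be bind variables; this is only an illustration, not the OracleXMLSave mapping itself):
-- year, month and day assembled into one string, then converted to a DATE
SQL> select to_date('2000' || '-' || '12' || '-' || '31', 'YYYY-MM-DD') as expiration_date from dual;
How the result displays depends on the session's NLS_DATE_FORMAT.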

Similar Messages

  • What are Parameters? How are they different from Variables? Why can't we use variables for passing data from one sequence to another? What is the advantage of using Parameters instead of Variables?

    Hi All,
    I am new to TestStand. Still in the process of learning it.
    What are Parameters? How are they different from Variables? Why can't we use variables for passing data from one sequence to another? What is the advantage of using Parameters instead of Variables?
    Thanks in advance,
    LaVIEWan
    Solved!
    Go to Solution.

    Hi,
    Using Parameters is the correct method to pass data into and out of a subsequence. You assign the data to be passed into or out of a sequence in the Edit Sequence Call dialog, in the Sequence Parameter list.
    Regards
    Ray Farmer

  • Use Firefox for sensitive data & use a virtual keyboard plus internet antivirus. This current version does not allow this, nor an extension for it. Can you make provision for this?

    Use Firefox for sensitive data & use a virtual keyboard plus internet antivirus. This current version does not allow this, nor an extension for it. Can you make provision for this?

  • LSMW used only for master data upload?

    Hi
    Can you please let me know if LSMW is used only for master data upload, or can we also use it for transaction data?

    Hi Christino.
    I have come across standard SDN threads which deal with uploading master data; refer to them:
    SDN reference for uploading master data using LSMW: "How can we upload master data by using LSMW"
    SDN reference for which approach is preferred (master data or transaction data): "Which one is better for uploading data, LSMW or eCATT?"
    Good Luck & Regards.
    HARSH

  • Use case for financial data

    Hi All,
    I have a question about a potential use case for Oracle Spatial. The data structures are the following:
    Clients
    Account (has a balance dimension, which can be zero or above zero)
    Client to account relationship
    E.g.
    Client C1 is a borrower to Account A1 (balance = 0)
    Client C1 is a co borrower to Account A2 (balance > 0)
    Client C2 is a co borrower to Account A1 (balance > 0)
    Client C3 is a co borrower to Account A3 (balance > 0)
    Currently, the database is modeled as a set of three tables, e.g.
    Client
    ID
    DATA
    Account
    ID
    DATA
    BALANCE
    CLIENT_TO_ACCOUNT
    CLIENT_ID
    RELATIONSHIP (E.g borrower)
    ACCOUNT_ID
    Business limitations:
    We are not interested in independent graphs for which all accounts have balance = 0 (let's call these inactive graphs); however, we might occasionally need to query them
    Users are interested in vertices/edges with accounts that have balance = 0 but are linked (up to level N) to an active account, for analysis purposes
    There is no well-defined root (e.g. there can be 2 or more clients who are co-borrowers on the same account)
    99% of queries will be against active graphs
    Graphs are mutable, e.g. new relationships (edges) may be created/deleted during the day
    Users are potentially interested in free navigation of a whole independent graph, starting from the root.
    The root is determined by a certain business rule
    Need to process active graphs daily in bulk
    Problems which I am trying to solve:
    Limit the amount of data which may need to be processed - based on analysis of the current system, we only need 5% of the data plus some delta for 99% of the processing
    Make sure performance does not degrade over time as we accumulate more historical (processed) data - we cannot delete accounts with balance = 0, as new relationships may later arrive involving new accounts with balance > 0
    Current solution I am thinking of:
    Artificially partition the data universe into active and inactive graphs. All indexes would be local to the two partitions.
    E.g.
    GROUP
    GROUP_ID PK
    ACTIVE_FLAG (partition key)
    CLIENT
    GROUP_ID (PARTITION BY FK TO GROUP)
    ACCOUNT
    GROUP_ID (PARTITION BY FK TO GROUP)
    CLIENT_TO_ACCOUNT
    GROUP_ID (PARTITION BY FK TO GROUP)
    The issues I am seeing right now:
    1. Graphs (groups) may be potentially unlimited in size, so I will need to artificially limit the size using some dividing algorithm - leading to
    2. Graphs(groups) may need to be joined or divided
    3. Graphs(groups) will have to be activated/deactivated - e.g. moved to different partitions.
    4. Data loading, activation/deactivation algorithms are not simple
    So I am thinking about Oracle Spatial (Network) to model this problem.
    Questions:
    1) Can I model this problem using Oracle Spatial?
    2) Will I gain any performance improvement?
    3) Is there any explanation or white paper on how to do this for this particular type of problem?
    4) Will the solution based on Oracle Spatial solve the problems outlined above?
    5) Will my solution (without using Oracle Spatial) work at all? Or are there some fundamental issues?
    Thank you!
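
    For illustration, a minimal DDL sketch of the active/inactive partition layout described above, assuming Oracle 11g list partitioning plus reference partitioning (all names are illustrative, and GROUP is renamed GRP because GROUP is a reserved word):
    CREATE TABLE grp (
        group_id    NUMBER  PRIMARY KEY,
        active_flag CHAR(1) NOT NULL
    )
    PARTITION BY LIST (active_flag) (
        PARTITION p_active   VALUES ('Y'),
        PARTITION p_inactive VALUES ('N')
    )
    ENABLE ROW MOVEMENT;  -- lets a GRP row change partition when ACTIVE_FLAG is updated

    CREATE TABLE client (
        id       NUMBER PRIMARY KEY,
        data     VARCHAR2(4000),
        group_id NUMBER NOT NULL,
        CONSTRAINT fk_client_grp FOREIGN KEY (group_id) REFERENCES grp (group_id)
    )
    PARTITION BY REFERENCE (fk_client_grp);  -- ACCOUNT and CLIENT_TO_ACCOUNT would follow the same pattern

    -- local index, partitioned the same way as its table
    CREATE INDEX client_group_ix ON client (group_id) LOCAL;
    Whether a group can be activated/deactivated simply by updating ACTIVE_FLAG (and have the reference-partitioned children follow) should be verified on the target release; if not, the activation/deactivation step would have to re-insert the affected rows.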

    Either add a LOV to the JobID attribute definition in the VO (if the JobID will be editable), or simply add the job description to the select statement (join to the job table) as a reference attribute.

  • Where Used List for Master Data Entries

    Hi folks,
    I am looking for an FM, method, etc. that gives me a list showing where a certain master data entry of an InfoObject is used. The BW system makes this check implicitly when trying to delete master data, but I haven't been able to get behind the logic yet.
    Anyone here with helpful hints?

    Hi Durgesh,
    Thank you for your answer!
    Unfortunately the two mentioned FMs are not helpful for me. I am looking for a where-used list of master data entries, not of InfoObjects. For example, I want to know in which InfoCubes the measure 'pieces' of InfoObject '0UNIT' is used.

  • Time Machine: removing the backup drive and using it for other data

    I am trying to stop the Time Machine backups of my data and use my external HD for other data. I am new to the Mac world and I need to know how to do this. Essentially, I want to shut Time Machine down and reformat the disk that was being used for the backups. Thanks for the help.

    The big switch in the Time Machine preference pane stops Time Machine from performing automatic backups. I would be inclined to recommend you switch that to Off but leave the disk assigned as the designated backup disk on the Change Disk panel, because if nothing is assigned there, Time Machine will ask you, every time a disk it could use is attached, whether you wish to assign it.

  • Flex 3: How to use trace for printing data in the console

    Hi ,
    I heard that we can use trace to print data to the Flex Builder 3 console, but when I tried it I had no luck.
    Below is a simple program in which I was out of luck.
    public function callMe():void {
        trace("AAA");
    }
    <mx:Button id="Register" name="Register" label="Register" height="23" click="callMe()"/>
    In the above program, after clicking the button, I can't see 'AAA' anywhere in my Flex Builder console.
    Any help?
    Thank you .

    Hi Kiran
                Set a breakpoint at the trace line and debug the application. There you can find the message you typed in the console. trace works only in debugging mode, not when simply running the application.
               Have a nice day
    Thanks
    Ram

  • Why should we use BRTOOLS for Oracle data backup?

    Hi there,
    Does any one of you have any idea of the specific advantages of using BRTOOLS for backups over using only Oracle RMAN?
    Are there any advantages to using brbackup for database backups which we do not get with RMAN alone?
    thanks
    Arlin

    I seriously doubt you want to use wsdl4j unless you are doing really advanced web service work. Assuming you are developing this web service from scratch, you basically want to use JAX-WS: define an appropriate interface and your value classes, and let JAX-WS do the rest. Metro is the JAX-WS implementation included in the Oracle JDK, and it has great tutorials and reference documentation online. I'd suggest you start here: http://metro.java.net/getting-started/

  • Input for nested data structure service

    I plan to use a service which requires some header information plus 1...n items of item information.
    The data structure of this service is nested, meaning there is only one input port for the header AND the nested items.
    Is it somehow possible to create a UI consisting of two inputs, i.e. one input form for the header and one input table for the items, on the input port?
    The main problem is that either the header or the item information is transported to the input port of the service, never both as required by the service. I tried to solve it using a data bridge (or, as it is called in EhP1 for 7.1, a "data share") without any success.
    Solved it by myself. I just forgot to check "Control Buttons" in the Configure tab of the table.
    Edited by: Stefan Witschel on Nov 18, 2008 1:47 PM

    Yes, I still used it.
    In EhP1 there is a feature that automatically adds a data share if you drag and drop the input port of your service with a nested structure anywhere on the screen. On this data share you can add another input form or table for the nested elements. Both forms/tables are then connected at the input with the share.
    Finally, you do the data mapping for the nested structure and define an action to call this connection.
    Actually, my problem was that I couldn't input data into the rows of a table for the nested structure. I solved it by enabling the table controls, which allow you to add and delete rows.

  • NEED HELP IN USING ALL_TAB_COLUMNS FOR RETRIEVING DATA???

    A table, say T1, contains columns like Emp_id and Code, and there are several Code values like C1, C2, C3. Another table, say T2, contains columns like Emp_id, C1, C2, C3. Here each value of the Code field of the T1 table has become a column of the T2 table, and the amount for each code in the T1 table is equal to the corresponding column value in the T2 table.
    Now I want to retrieve data from the T2 table like
    C1 200
    C2 300
    C3 140
    I cannot retrieve data like this using all_tab_columns.
    I can only get the column_name, but not its value.
    PLEASE HELP ME...
    Edited by: user630863 on Apr 8, 2009 11:37 AM

    Table T1
    emp_id | code
    001    | C1
    001    | C2
    005    | C3
    005    | C1
    002    | C2
    002    | C3
    Table T2
    emp_id | C1 | C2 | C3
    001    | 10 | 15 |
    002    |    | 7  | 12
    005    | 45 |    | 94
    I have written a query
    select column_name from all_tab_columns a,T1 b
    where a.column_name=b.code
    and table_name='T2'
    OUTPUT:
    C1
    C2
    C3
    But I need data for each employee, like
    001 C1 10
    001 C2 15
    002 C2 7
    002 C3 12
    005 C1 45
    005 C3 94
    Edited by: user630863 on Apr 8, 2009 1:28 PM
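
    For a fixed, known set of code columns, a minimal sketch of the kind of unpivot described above (table and column names are taken from the example; on Oracle 11g the UNPIVOT clause would be an alternative, and a truly dynamic column list would require building the statement from all_tab_columns with dynamic SQL):
    -- one branch per code column; rows where the amount is NULL are skipped
    select emp_id, 'C1' as code, c1 as amount from t2 where c1 is not null
    union all
    select emp_id, 'C2', c2 from t2 where c2 is not null
    union all
    select emp_id, 'C3', c3 from t2 where c3 is not null
    order by emp_id, code;
    With the sample data in T2, this returns exactly the six rows requested above (001 C1 10 ... 005 C3 94).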

  • How to use NAS for copying data on?

    Well, this may sound like a simple question, but apparently it's not as simple as it seems.
    I have a Raidon SL3620 NAS. I managed to set it up with help of one of our community members.
    However, now I found the problem of not being able to copy all my data on there. I want to copy my movies etc to the NAS, so I created a user (myself) and a group (called public), with no shared space limits. I granted access to all to make things easy, and after the harddrives were formatted, I drag-dropped the relevant folders into the new user directory on the NAS.
    First of all, it predicted some 100 hours to complete the task (and my iMac only has a 1TB hard drive), whilst my Mac and the NAS are both connected via Ethernet. I find that unlikely.
    Then when I woke up this morning, it had aborted the transfer due to lack of space. It has 2x 2TB storage in RAID 1, and as I said, my iMac has a 1TB HD. So why does it say it's full after 900MB??
    Also, when I then decided to remove the data, I was told some files were in use. Which surprised me because the copy was cancelled. I checked airport utility and no one was using it, so I wonder what's happening there!
    Can anyone help me out?

    Right, I think this was caused by using a wireless connection, which takes longer than smoke signals. Using Ethernet seems to have solved the issue for now.

  • How to use exception for a Date Key Figure

    Hello All,
                    I have the following requirement.
    1. I have a Key Figure which is Date Type.
    2. I need to color the cell green if it is filled with a date; otherwise leave it as it is.
    Please suggest how to overcome it.
    Thanks & Regards,
    Rajib

    hi,
    Your requirement is not clear; you have the setup below:
    I have the following requirement.
    1. I have a Key Figure which is Date Type.
    How can a KF be of date type? Or is it the value of a date characteristic that you have extracted into a KF using a formula variable? If so, you just need to define an exception for values greater than 0.
    regards,
    Arvind.

  • Should we be using RAC for a data warehouse?

    We have an Oracle 11.1 data warehouse system. We were having some performance issues with the system, so we shut down one of the RAC nodes to see if that was causing the problem. The problem was slow updates on a table (all 30+ million rows in one table had to be fixed). One other performance problem is queries of large partitioned tables (even if the partition key is used). Both bulk collects and bulk inserts are very fast.
    Question: for an 11.1 data warehouse system, should we use RAC? Why?
    Thank you...

    "a school of thought that suggests RAC potentially decreases system availability, rather than increasing it."
    RAC also has the potential of increasing availability. The potential "cuts both ways", so to speak.
    I've worked with non-RAC and RAC databases on a variety of platforms. My experience doesn't show evidence that RAC decreases availability. Given that most servers, even in non-HA clusters, are generally very reliable, downtime is low in both non-RAC and RAC environments. However, RAC does provide an availability advantage: protection against node outage. And there are environments which do require the availability of RAC. Not all applications require it. RAC is oversold, not in terms of advantages but in terms of installations.
    "the increased complexity and the increased risk of both software and human related errors in a RAC environment"
    I would say that a similar argument arises with DASD vs SAN. A SAN is more complex. Human error on a SAN has a much higher cost. Human error does occur on SANs. However, no one rejects a SAN on these grounds alone.
    RAC is complex to implement. It requires more skills to administer and diagnose. However, if it is set up well, it doesn't suffer outages. An outage from human error is the same as in a non-RAC environment.
    The issue isn't RAC. The issue is that too many customers buy RAC without evaluating seriously whether
    a. they need the additional minute increase in availability
    b. whether their applications are "RAC-aware" (TAF is still misunderstood)
    c. whether they have the skills
    RAC provides scalability. It also provides HA. Let me say that again: it also provides HA.
    I've seen a high-end failover cluster environment where one of the "best" vendors in the world talked of a 10-30 minute outage for the failover.
    Hemant K Chitale
    http://hemantoracledba.blogspot.com
    Edited by: Hemant K Chitale on May 31, 2009 11:41 PM

  • Is it safe to use iCloud for private data?

    One of my pals (a devoted fan of Apple) was speculating about iCloud usability, even though he always maintains quite a high level of data protection: passwords, passcodes, password keepers and managers, data security - that's what he relies on. Even so, he didn't suppose that a quick connection to an unprotected public Wi-Fi network in a McDonald's could lead to data leakage. We were sitting in one. He couldn't believe my words, but once I knew his Apple ID and password, I showed how easily all his work and private life was at my disposal, while his iPhone was still lying in his pocket; I didn't even touch it the whole time. I could now get all his SMS, pictures and all his geolocation data, even new ones, with only a small delay. Finally, my friend developed this FBI syndrome that he is being watched, and so on. He even started dwelling on global spyware, and the very next day when we met he was carrying a simple brick phone. The point here is that even an iPhone and public Wi-Fi are incompatible, because even an iPhone can provide only relative security, and even if it is not jailbroken and runs only licensed software, you can easily become a victim because of Apple software shortcomings. And that is why, right after Elcomsoft updated their EPPB with the new feature which can download iCloud data, IBM officially prohibited their employees from using both iCloud and iPhones for business. Do you think you are being watched? Oh, perhaps you are.

    Yes. Just make sure you don't make ANY typos.
