How to transform DATE?

Hello!
Right now I run "select SYSDATE from dual" and get "24-MAY-01".
How can I get the date and time in a different format by default, without using "TO_CHAR(SYSDATE, format)"?

Change your default date format with:
ALTER SESSION SET NLS_DATE_FORMAT='format_string';
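For example, to see both date and time for the rest of the session (the format string below is only one possibility):
ALTER SESSION SET NLS_DATE_FORMAT='YYYY-MM-DD HH24:MI:SS';
select SYSDATE from dual;
-- now returns something like 2001-05-24 14:32:07
The session setting overrides the database or client default; to change the default permanently you would set NLS_DATE_FORMAT at the instance or client environment level instead.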

Similar Messages

  • How to transform data received from file adapter

    Hi,
    I am reading data from an XML file using the file adapter, and I want to write the same contents I read out to a different/new XML file. The issue is that I cannot get the transform to work: when I use a transform with the source element set to the output of the file read and the target set to the input of the new file, it writes an empty file.
    I checked the audit trail, and it shows that the transform result is empty.
    Please tell me how I can transform the elements received from one file so they become the input to the next file.
    Thanks
    Yatan

    Thanks, James, for the input. These are my XML and XSD files:
    XML file:
    <?xml version="1.0" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://www.example.org"
    targetNamespace="http://www.example.org"
    elementFormDefault="qualified">
    <emp>
    <name>yatan</name>
    <age>28</age>
    </emp>
    </xsd:schema>
    XSD File:
    <?xml version="1.0" encoding="windows-1252" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns="http://www.example.org"
    targetNamespace="http://www.example.org"
    elementFormDefault="qualified">
    <xsd:element name="readfile">
    <xsd:complexType>
    <xsd:sequence>
    <xsd:element name="name" type="xsd:string"/>
    <xsd:element name="empid" type="xsd:string"/>
    </xsd:sequence>
    </xsd:complexType>
    </xsd:element>
    </xsd:schema>
    I am reading an XML file; my process is as follows:
    1.) File adapter>read operation ---> receive activity
    2.) Transform > source(receive input variable) <--->target(Invoke input variable)
    3.) File adapter > write operation ---> invoke activity
    Now when I deploy this process it completes successfully and is able to read the file.
    When I check the audit trail, the receive activity shows the XML data, but the transform
    shows up empty as below; the write activity is still performed and creates an XML file, but with no values in the elements.
    transform in audit trail
    <Invoke_1_Write_InputVariable>
    -<part xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" name="readfile">
    -<readfile xmlns:ns0="http://www.example.org" xmlns="http://www.example.org">
    <ns0:name/>
    <ns0:empid/>
    </readfile>
    </part>
    </Invoke_1_Write_InputVariable>
    this is the code for my .bpel file
    <?xml version = "1.0" encoding = "UTF-8" ?>
    <!--
    Oracle JDeveloper BPEL Designer
    Created: Wed May 19 15:04:22 IST 2010
    Author: yatanveer.s
    Purpose: Empty BPEL Process
    -->
    <process name="ReadXMLFile2"
    targetNamespace="http://xmlns.oracle.com/ReadXMLFile2"
    xmlns="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
    xmlns:xp20="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.Xpath20"
    xmlns:bpws="http://schemas.xmlsoap.org/ws/2003/03/business-process/"
    xmlns:ns4="http://xmlns.oracle.com/pcbpel/adapter/file/FileReadSync/"
    xmlns:ids="http://xmlns.oracle.com/bpel/services/IdentityService/xpath"
    xmlns:ldap="http://schemas.oracle.com/xpath/extension/ldap"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:client="http://xmlns.oracle.com/ReadXMLFile2"
    xmlns:ora="http://schemas.oracle.com/xpath/extension"
    xmlns:hwf="http://xmlns.oracle.com/bpel/workflow/xpath"
    xmlns:ns1="http://xmlns.oracle.com/pcbpel/adapter/file/ReadFile/"
    xmlns:ehdr="http://www.oracle.com/XSL/Transform/java/oracle.tip.esb.server.headers.ESBHeaderFunctions"
    xmlns:ns3="http://www.example.org"
    xmlns:ns2="http://xmlns.oracle.com/pcbpel/adapter/file/WriteFile/"
    xmlns:bpelx="http://schemas.oracle.com/bpel/extension"
    xmlns:orcl="http://www.oracle.com/XSL/Transform/java/oracle.tip.pc.services.functions.ExtFunc">
    <!--
    PARTNERLINKS
    List of services participating in this BPEL process
    -->
    <partnerLinks>
    <partnerLink myRole="Read_role" name="ReadFile"
    partnerLinkType="ns1:Read_plt"/>
    <partnerLink name="WriteFile" partnerRole="Write_role"
    partnerLinkType="ns2:Write_plt"/>
    </partnerLinks>
    <variables>
    <variable name="Receive_1_Read_InputVariable"
    messageType="ns1:readfile_msg"/>
    <variable name="Invoke_1_Write_InputVariable"
    messageType="ns2:readfile_msg"/>
    </variables>
    <!--
    VARIABLES
    List of messages and XML documents used within this BPEL process
    -->
    <!--
    ORCHESTRATION LOGIC
    Set of activities coordinating the flow of messages across the
    services integrated within this business process
    -->
    <sequence name="main">
    <receive name="Receive_1" partnerLink="ReadFile" portType="ns1:Read_ptt"
    operation="Read" variable="Receive_1_Read_InputVariable"
    createInstance="yes"/>
    <assign name="Transform_1">
    <bpelx:annotation>
    <bpelx:pattern>transformation</bpelx:pattern>
    </bpelx:annotation>
    <copy>
    <from expression="ora:processXSLT('Transformation_3.xsl',bpws:getVariableData('Receive_1_Read_InputVariable','readfile'))"/>
    <to variable="Invoke_1_Write_InputVariable" part="readfile"/>
    </copy>
    </assign>
    <invoke name="Invoke_1" partnerLink="WriteFile" portType="ns2:Write_ptt"
    operation="Write" inputVariable="Invoke_1_Write_InputVariable"/>
    </sequence>
    </process>

  • How to transform data while populating an external table

    Hi Folks,
    I have to populate the first four fields of my external table from the data in a CSV file, while the last field should hold SYSDATE. My code to create the external table (and populate the first four fields) is as follows:
    create table T_XT
    (PNODE VARCHAR2(10),
    NODE VARCHAR2(15),
    SUB_NODE VARCHAR2(12),
    NODE_COUNT NUMBER,
    CREATE_DATE DATE)
    organization external
    (type oracle_loader
    default DIRECTORY topology
    access parameters
    (records delimited by NEWLINE
    characterset US7ASCII
    badfile 'TOPOLOGY':'Modem_Count.bad'
    discardfile 'TOPOLOGY':'Modem_Count.dis'
    logfile 'TOPOLOGY':'Modem_Count.log'
    fields terminated by ','
    optionally enclosed by "'")
    location ('Modem_Count.csv'))
    reject limit unlimited parallel;
    Can somebody please let me know where this data transformation (for the last field) should be specified, and the syntax to do so?
    Thanks in advance
    rogers42

    Hi,
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6790030213850#59876418675556
    You'd rather just:
    select pnode,
           node,
           sub_node,
           node_count,
           sysdate
    from   t_xt;
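    If CREATE_DATE is still wanted as a column, one common option (in the spirit of the AskTom link above) is to define the external table with only the four file columns and supply SYSDATE through a view; a minimal sketch, with an illustrative view name and assuming t_xt then holds just the four file columns:
    create or replace view t_xt_v as
    select pnode,
           node,
           sub_node,
           node_count,
           sysdate as create_date
    from   t_xt;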

  • How to transform data

    Hi,
    My data extraction scenario is:
    I need ORG ID, JOB ID and Staff ID.
    I get all of this info from OBJ ID; when restricted to a certain 0TYPE, it gives each value.
    My requirement: while loading the data, can we load all the Org IDs to 0Org Unit, the Job IDs to 0JOB, and so on,
    instead of doing it in reporting?
    If so, how can I do that?
    Thanks

    Hi,
    You can do this in the update / transfer routine.
    Add ORG ID, JOB ID and Staff ID to your data target,
    then write a routine like the one below for each of them.
    ORGID:
    select ORGID from /bi0/mOBJID into RESULT where
    OBJID = comm_structure-OBJID
    and TYPE = <your required type here>
    and OBJVERS = 'A'.
    Do the same for the others as well.

  • How to transform data in the second cube

    Hi,
    I have been given this scenario where I have to load from one cube into another cube.
    The data in first cube is in three currencies
    LC      GC      TC
    100     200     300
    Now when I load this into second cube, I want to insert currency type infoobject and get the result as
    Currencytype          Amount
    10                     100
    20                     200
    30                     300
    I would greatly appreciate it if you could give me your thoughts on how to do this. Looking forward to your inputs.
    Regards,
    Kal

    Hi Kal,
    You have two options:
    1. Have 3 KFs and 3 currency keys accordingly.
    2. Have 3 KFs only in the first cube. Knowing in which currency each KF is denominated, you can easily determine a currency type and write it into a separate infoobject (there is a business content infoobject, 0CURTYPE).
    In the 2nd cube you'll have a currency type IO and one KF for the amounts.
    In the start routine of the update rules you'll have to triple every record.
    You'll map the amount KF in the 2nd cube to, for example, the KF in group currency; the routine will add two more records for the other currencies.
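    For illustration only, the tripling can be pictured as the following SQL reshaping (in BW it would actually live in the ABAP start routine of the update rules; all table and field names here are made up):
    select '10' as currtype, lc_amount as amount from cube1_data
    union all
    select '20', gc_amount from cube1_data
    union all
    select '30', tc_amount from cube1_data;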
    It would be easier to have 1 KF and a currency type in the 1st cube. In that case you wouldn't need any routines and the mapping would be 1:1.
    However, all these approaches have a drawback: if you need to report on some particular currency, you don't have that info in the cube.
    And, as written in the previous posts, group currency may differ from group to group. Would you like to mix apples and oranges?
    And finally, the generally used approach for such tasks is to have 3 KFs and 3 currency key chars:
    Re: Unit and Currency infoobject
    Best regards,
    Eugene

  • How to transform data from hierarchy table to taxonomy table?

    Hi Friends
    Regarding my issue:
    1) I have data in the hierarchy table, which contains two fields (code, description). We have now created a taxonomy table which contains the same two fields (code, description).
    2) Only the hierarchy table contains data, which we imported into it using the Import Manager; we don't have any data in the taxonomy table.
    3) Now my issue: we have to move the hierarchical data from the hierarchy table to the taxonomy table without using the Import Manager, only through the Data Manager using validations and assignments.
    4) If anyone has a solution to this issue, please help me out.
    Thanks in Advance
    bharat.chinthapatla

    Hi Ganesh,
    Thanks for your reply,
    I need to move the data from the hierarchy table to the taxonomy table using validations and assignments.
    Thanks in Advance
    bharat.chinthapatla

  • How to transform data from a file to the DB when the data must be inserted into two tables in parallel?

    Hi friends, I am new to Oracle SOA. I have one requirement for my project:
    1. I have to read three different types of files (Employee info, Purchase Order info, Sales Order info) through the file adapter. These records are to be stored in a database
    transaction table. This transaction table contains WHO columns and a payload column of CLOB data type, which holds each record in XML form.
    2. Now I have to take this payload data, which is in XML form, and store it in the respective table (Employee, PO or SO). I have to do this dynamically.
    So please give me your ideas and suggestions.
    thanks in advance
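    For step 2, one possible approach is to shred the stored XML payload with XMLTABLE; a minimal sketch, with made-up table, column and element names:
    INSERT INTO emp_info (emp_name, emp_id)
    SELECT x.emp_name, x.emp_id
    FROM   txn_log t,
           XMLTABLE('/Employee'
                    PASSING XMLTYPE(t.payload)
                    COLUMNS emp_name VARCHAR2(100) PATH 'Name',
                            emp_id   VARCHAR2(30)  PATH 'EmpId') x
    WHERE  t.record_type = 'EMP';
    A similar statement per record type (PO, SO) could then be chosen dynamically, e.g. from a lookup of record type to target table.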

    Roger,
    I don't know of a simple way to load data from a file into a table using a button in your application.
    Maybe some piece of PL/SQL code with UTL_FILE or DBMS_LOB, or you can try to upload the file to the database using an Apex "File Browse" item and then do some processing with DBMS_LOB... but that will require some work.
    Maybe someone else can advise a simple method I don't know of...
    However loading files using Apex Load Utility is quite easy.
    You can use something like this to load data into views created on remote tables:
    CREATE TABLE local_tmp_table AS SELECT * FROM view_on_remote_table WHERE 1=0;
    --load data from file into the created table using the Apex Load utility
    INSERT INTO view_on_remote_table SELECT * FROM local_tmp_table;
    COMMIT;
    DROP TABLE local_tmp_table;
    Or without views:
    CREATE TABLE local_tmp_table AS SELECT * FROM remote_table@dblink WHERE 1=0;
    --load data from file into the created table using the Apex Load utility
    INSERT INTO remote_table@dblink SELECT * FROM local_tmp_table;
    DROP TABLE local_tmp_table;
    Regards
    Tomasz
    Message was edited by:
    Tomasz K.

  • How to load data from PSA to CUBE & DSO at a time using DTP in BI 7 ?

    Hi all,
    I am new to BI 7. How do I load data to a DSO and an InfoCube at the same time using a DTP?
    Please provide me the steps to load, and please specify which update mode I should use (FULL or DELTA) and which one is best.
    Please suggest.
    Thanks & Regards,
    Kiran M.
    Message was edited by:
            kiran manyam

    Below are the basic steps we follow in any BI 2004S system:
    1) Create the datasource. Here you can set/check the source system fields.
    2) Create a transformation for that datasource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DS it will ask you for a data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> Data Target
    Now if you want to load data into the data target from a source system datasource:
    1) Create an infopackage for that datasource. If you are creating an infopackage for a new datasource, it will only allow you to update up to the PSA; all other options appear disabled.
    2) Now create a DTP (Data Transfer Process) for that datasource.
    3) Now schedule the infopackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    If you are loading data from one data target to another, there is no need to use the PSA; you can directly execute the DTP in that case.
    Data Source -> Transformation (IP/DTP) -> Data Target1 -> DTP -> Data Target 2
    Use the link below for a detailed example:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/fc61e12d-0a01-0010-2883-e2fc63ef729b
    Infosources are no longer mandatory with BI 7.0; below is the link to scenarios where we use infosources:
    http://help.sap.com/saphelp_nw04s/helpdata/en/44/0243dd8ae1603ae10000000a1553f6/content.htm
    Full or delta depends on your requirement...
    Check the thread below to know better:
    difference between the various loads
    Hope it helps.
    Message was edited by:
            sriram viswanathan

  • How to move data from a staging table to three entity tables #2

    Environment: SQL Server 2008 R2
    I have a few questions:
    How would I prevent duplicate records when/if the SSIS package is executed many times?
    How would I know that the whole huge volume of data has been loaded into the entity tables?
    In reference to "how to move data from a staging table to three entity tables", since I am loading a large volume of data while using a lookup transformation:
    which of the merge components is best suited?
    How do I configure the merge component correctly? (a screenshot is preferred)
    Please refer to the following link
    http://social.msdn.microsoft.com/Forums/en-US/5f2128c8-3ddd-4455-9076-05fa1902a62a/how-to-move-data-from-a-staging-table-to-three-entity-tables?forum=sqlintegrationservices

    You can use a RowCount transformation in the path where you want to capture record details. Inside the RowCount transformation, pass an integer variable to receive the count value;
    the event handler can then be configured as below:
    inside an Execute SQL Task, add an INSERT statement to write the row count to your audit table.
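    A minimal sketch of that audit insert (table and variable names are made up; the ? markers assume an OLE DB connection and are mapped to System::PackageName and the User::RowCount variable in the task's parameter mapping):
    INSERT INTO dbo.LoadAudit (PackageName, LoadDate, RowsLoaded)
    VALUES (?, GETDATE(), ?);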
    Can you also show me how to check against the destination table using key columns inside a Lookup task and insert only the
    non-matched records (No Match output)?
    This is explained clearly in the link below, which Arthur posted:
    http://www.sqlis.com/sqlis/post/Get-all-from-Table-A-that-isnt-in-Table-B.aspx
    For large data volumes I would prefer doing this in T-SQL. So what you could do is dump the data to a staging table and then apply a
    T-SQL MERGE between the tables (or even a combination of INSERT/UPDATE statements).
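    A minimal sketch of that MERGE (all object names are made up); because matched rows are updated rather than re-inserted, re-running the load does not create duplicates:
    MERGE dbo.EntityTable AS tgt
    USING dbo.StagingTable AS src
        ON tgt.BusinessKey = src.BusinessKey
    WHEN MATCHED THEN
        UPDATE SET tgt.SomeColumn = src.SomeColumn
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (BusinessKey, SomeColumn)
        VALUES (src.BusinessKey, src.SomeColumn);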
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • How to load data from a query?

    Hi all,
    I'm not very familiar with ODI, but we already use it to load tables (into Hyperion Planning).
    Now we need to execute a query that joins different tables (I don't want to only apply a filter) in order to load the result. We can't create views in the data warehouse.
    How can I do this? What is your recommendation?
    Thanks for your help

    Yes, this is possible.
    Start with a single query that returns you the data for all lecturers.
    Right-click on the "all lecturers" query in the queries pane, and choose Reference, once for each lecturer. This will create a query for each lecturer.
    Open the query for each lecturer and filter the data so that only that lecturer's results are visible.
    Click "Close & Load To..." for each lecturer's query  to load the data into the data model. This will create a data model table for each lecturer.
    If your question is more about how to transform such a survey list into a table that can be easily filtered, please provide an example of how the list shows up in Power Query.
    Ehren

  • How to load data from MS Excel (CSV format) to NW Excel

    Hi,
    I am doing a migration project from SAP MS to NW manually.
    I need to know how to load data from MS Excel (CSV format) into NW Excel,
    for example the 2008 budget data.
    Could you please help me with this?
    Thanks and Regards
    Krishna

    Hi,
    You need to create a transformation file and, if required, a conversion file. First upload the Excel (CSV) file into BPC using Manage Data and the Upload File option.
    Create the transformation file (refer to the SAP help on how to define a transformation file). You need to specify the mapping correctly and include all your application dimensions, mapping them to the appropriate columns of the flat file.
    Before running the import package, validate the data in the flat file you uploaded into BPC against the transformation file you created.
    Thanks,
    Sreeni

  • How to view data into Infocube

    Good morning, everybody.
    I would like to know if somebody has a document with information about RSA1, for example "How to create a backup InfoCube?" or "How to view data in an InfoCube?"
    If you can, send it to me:
    dacampos at br.ibm.com
    Thanks a Lot
    Daniel Campos - IBM Brazil

    Hi,
       A backup cube is used for saving the data of a planning area.
    The following steps create a backup InfoCube:
    1. First create a DataSource from the planning area: double-click the PA, then go to Extras, where you find the option for creating a DataSource; then replicate the created DataSource.
    2. Go to RSA1, activate the created DataSource and create an InfoPackage under the DataSource.
    3. Create an InfoCube under an InfoArea.
    4. Create a transformation between the DataSource and the InfoCube.
    5. Activate the Data Transfer Process (DTP).
    For viewing the data in an InfoCube:
    1. Right-click the selected InfoCube.
    2. Go to Manage.
    3. Select the request and the InfoCube type (in Selectable Data Targets for Administration) and go to Contents.
    4. Select the required fields in Field Selection for Output.
    5. Go back and increase the number of hits if required.
    6. Execute it. You will find the data in the selected fields.
    Hope this will be helpful for you.
    Regards
    Sujay

  • [APD]: How to transfer data to SAP CRM?.

    Dear Experts,
    1. Do you know how to transfer data from an APD to SAP CRM?
    2. What I know is that we can transfer attributes of the Business Partner (0BPARTNER) to CRM.
    Can we transfer data beyond the attributes of 0BPARTNER, for example data from an ODS to a Z-table on SAP CRM?
    I really need your guidance and suggestions.
    Many thanks for your attention,
    Best regards,
    Niel.

    Hi,
    "1. EEWB in this case is used to create tables in CRM systems. So once I've made the tables, I can see those tables in the BW system (particularly when I'm creating the transformation)."
    Yes, you are correct.
    "2. The procedure that I should follow sequentially:
    a. [CRM system]: Create Z-tables through t-code EEWB.
    By the way, do I make tables through this t-code in both the BW system and the CRM system?
    If yes, should the project name in EEWB be the same in both systems?
    b. [BW system]: Create the APD.
    c. [BW system]: Create a transformation in the APD for mapping, where you'll see the table created in the CRM system."
    a) Correct.
    You will make the tables in the CRM system. After the extension creation you can check them in t-code SE12 in the CRM system.
    b) By project name do you mean the project you assign to the transport requests?
    Then it depends on your systems. Since these are two different systems, you should create two different projects in the two systems and assign the projects accordingly.
    c) Correct.
    "And for the others:
    I made a project in EEWB and chose the business object ADS (Analytical Data Storage). Is that the correct business object to choose?"
    Yes.
    Yes, I think this is a bug. I have faced the same issue many times. What you can do is try to deselect all the key figures and then select the keys again; this helps sometimes. After trying two or three times I think you will learn a way to get around it. :)
    If you need my help in the future, you are welcome.
    Thanks

  • How to load data to content cube 0BCS_C21?

    Hi experts,
    I am wondering how to upload data to the BI Content InfoCube 0BCS_C21. Is there any content DataSource for it? I cannot find any content DataSource, InfoSource, transformation, ...
    Thank you,
    Michal

    Hi,
    You can find some more information here:
    http://help.sap.com/saphelp_nw70/helpdata/en/cf/e9376e1a2c43acafb6023b6d631b92/frameset.htm
    Apparently it gets its data from cube 0BCS_C11.
    Regards,
    Dave

  • How to load data from r3 to PSA

    Hello gurus,
    I am new to BI 7.0.
    I am working on the standard cube 0RT_C05
    with DataSource 2LIS_02_SCL.
    I have replicated the DataSource from R/3 to BW.
    Now I want to know how to push data from R/3 to BW, up to the PSA level and then to the cube.
    Thank you; points will be assigned as my gesture for your efforts.
    Regards
    Rahul

    Hi,
    Here is how data loading works in BI 7.0, using flat file extraction as an example.
    First create a cube or DSO with the same structure you have in the flat file,
    and activate it.
    -> Now go to the Datasource tab and create a datasource. Here you need to select the type of data (for example, transactional data), mention your flat file name in the Extraction tab along with the file type, enter your InfoObject names in the Fields tab, load the preview data and activate it.
    Now select your datasource, create an infopackage and schedule it; your data will now be loaded to the PSA level.
    -> Then go to the InfoProvider, select your cube, right-click it, create the transformations and activate them.
    -> Then create a DTP, activate it and execute it.
    1) Create the datasource. Here you can set/check the source system fields.
    2) Create a transformation for that datasource (no more update rules/transfer rules).
    2.1) While creating the transformation for the DS it will ask you for a data target name, so just assign where you want to update your data.
    DataSource -> Transformation -> (DTP) -> Data Target
    Now if you want to load data into the data target from a source system datasource:
    1) Create an infopackage for that datasource. If you are creating an infopackage for a new datasource, it will only allow you to update up to the PSA; all other options appear disabled.
    2) Now create a DTP (Data Transfer Process) for that datasource.
    3) Now schedule the infopackage; once the data is loaded to the PSA, you can execute your DTP, which will load the data to the data target.
    The Data Transfer Process (DTP) is now used to load data using the data flow created by the transformation. Here's how the DTP data load works:
    1) Load the InfoPackage.
    2) Data gets loaded into the PSA (hence why "PSA only" is selected).
    3) The DTP gets executed.
    4) Data gets loaded from the PSA into the data target once the DTP has executed.
    Ramesh
