Data loading: formatting data for timestamp column

Hi All,
I have a table with a timestamp column named created_date. I want to upload data to that table using the data loading page, but there is a problem while uploading: my CSV file contains created_date values in two different formats, as follows:
09/03/2013 03:33am
09/02/2013 03:24pm
The above data throws the error ORA-01821: date format not recognized.
On the Data / Table Mapping page, I tried MM/DD/YYYY HH12:MI:SS AM. What format should I use for am and pm?
Please help me solve this.
Thanks in advance
Lakshmi

I solved it by using the format MM/DD/YYYY HH:MIAM.
Thanks
Lakshmi
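
For reference, the same format mask also works directly in SQL with TO_TIMESTAMP. A minimal sketch, assuming a hypothetical table MY_TABLE with a TIMESTAMP column CREATED_DATE (both names are placeholders, not from the original post):

-- Hypothetical table and column; the mask matches values such as 09/03/2013 03:33am.
INSERT INTO my_table (created_date)
VALUES (TO_TIMESTAMP('09/03/2013 03:33am', 'MM/DD/YYYY HH:MIAM'));

-- The mask can also be checked on its own:
SELECT TO_TIMESTAMP('09/02/2013 03:24pm', 'MM/DD/YYYY HH:MIAM') FROM dual;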

Similar Messages

  • R12 Discrete Mfg Data Loader Formats

    Hi All,
    Can anyone share the R12 Discrete Manufacturing Data Loader formats for loading Bills and Routings?
    If anyone has them, please forward them to my mail id: [email protected]
    regards and thanks in advance,
    VAS

    Mugunthan
    Yes, we have applied 11i.AZ.H.2. I am still getting several errors that we are trying to resolve.
    One of them is:
    ===========>>>
    Uploading snapshot to central instance failed, with 3 different messages
    Error: An invalid status '-1' was passed to fnd_concurrent.set_completion_status. The valid statuses are: 'NORMAL', 'WARNING', 'ERROR'FND     at oracle.apps.az.r12.util.XmlTransmorpher.<init>(XmlTransmorpher.java:301)
         at oracle.apps.az.r12.extractor.cpserver.APIExtractor.insertGenericSelectionSet(APIExtractor.java:231)
    please assist.
    regards
    girish

  • Latest PowerQuery issues with data load to data models built with older version + issue when query is changed

    We have a tool built in Excel + Power Query version 2.18.3874.242 - 32 bit (no PowerPivot), using data load to the data model (not to the workbook). There are data filters linked to Excel cells, inserted into the OData query before the data is pulled.
    The Excel tool uses organisational credentials to authenticate.
    System config: Win 8.1, Office 2013 (32 bit)
    The tool runs for all users as long as they do not upgrade to PowerQuery_2.20.3945.242 (32-bit).
    Once upgraded, users can no longer get the data to load to the model. Data still loads to the workbook but the model breaks down, and resetting the load to the data model erases all measures.
    Here are the exact errors users get:
    1. [DataSource.Error] Cannot parse OData response result. Error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.
    2. The Data Model table could not be refreshed: There isn't enough memory to complete this action. Try using less data or closing other applications. To increase memory available, consider ......

    Hi Nitin,
    Is this still an issue? If so, can you kindly provide the details that Hadeel has asked for?
    Regards,
    Michael Amadi
    Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to vote it as helpful :)
    Website: http://www.nimblelearn.com, Twitter: @nimblelearn

  • How to rectify the errors in master data loads & transactional data loads?

    Hi,
    Can anyone please tell me how to rectify the errors in master data loads and transactional data loads?
    Thanks,
    Ravi

    Hi,
    Please post specific questions in the forum.
    Please explain the error you are getting.
    -Vikram

  • XML Publisher Issue, Unable to see Data-- Load XML Data Option.

    Hi,
    Can anyone help to solve this issue?
    I am using the XML Publisher tool to develop Oracle reports, but I am unable to use XML Publisher.
    We have XML Publisher version 5.5.
    My problem is that I am unable to see "Microsoft Word --> Data --> Load XML Data" after installing XML Publisher successfully.
    Everyone on my team is working with the same versions, but on my machine I cannot get this option.
    Thanks-
    Sowmya.

    Hi Sairam,
    Check the checkboxes Convert database Null values to default and Convert other Null values to default under Edit -> Report Options, then refresh the report.
    If this does not work you can contact SAP Support.
    - Nrupal

  • Getting Error When I click on Data -- Load XML Data (XMLP Desktop 5.6.2)

    Hi All,
    I'm getting the error "Compile error in hidden module: Module_registry" when I click Data --> Load XML Data in MS Word. I have installed XMLP Desktop 5.6.2 successfully. Can anybody help me?
    Regards,
    Aanta

    Can you check the Event Viewer on your desktop (Start -- Run -- eventvwr) to see whether an error is logged against any DLL in the Application log each time you hit this error? Also, is this the first time you have installed XMLP Desktop, or did you have an older version?

  • Data Load function - max number of columns?

    Hello,
    I was able to successfully create a page with the Data Load Wizard. I've used this page wizard before with no issues; however, in this particular case I want to load a spreadsheet with a lot of columns (99 to be precise). When I run the page, it uploads the spreadsheet into the proper table, but only the first 45 columns; the remaining columns are null for all rows. Also, there are 100 rows in the spreadsheet and it doesn't load them all (it loads 39).
    Is there a limit to the number of columns it can handle?
    Also, when I re-upload the same file, the data load results show that it inserted 0 rows, updated 100 rows, and failed 0 rows. However, there are still only 39 rows in the table.
    Thoughts?
    Steve

    Steve wrote:
    FYI, I figured out why it wasn't loading all 100 rows. Basically I needed to set two dependent columns in the load definition instead of one; it was seeing multiple rows in the spreadsheet and assuming some were the same record based on one column. So that part is solved.
    I still would like feedback on the number of columns the Data Load Wizard can handle, and whether there's a way to handle more than 45.
    The Data Load Wizard can handle a maximum of 46 columns: +{message:id=10107069}+

  • Master Data Loading Format Error

    Dear Gurus,
    I want to load master data texts for infoobject 0PROJECT. However, after loading, all the data from the source system has been changed. For example, a project named "A-CN0003" in the R/3 system has been changed to "ACN0003000000" after loading into the infoobject in the BW system. As you can see, "-" has been removed and several "0"s have been appended at the end. I checked the PSA and found that all the project definitions have the same format, like "ACN0003000000", but in RSA3 of the R/3 system they remain in the format "A-CN0003".
    What can I do to keep the project definition in the format "A-CN0003"?
      Thanks in advance.
      Jin Ming

    Hi,
    Double-click on 0PROJECT --> General tab --> Conversion Routine.
    If there is a conversion routine there, remove it, then delete the data and reload it.
    Hope this helps.

  • Second BW data load job waits for first to complete.

    Hi all,
    I am manually loading data into my BW system. I have observed that if I try to load two DataSources together, the system waits for one to complete and only then triggers the second load, e.g. if I run 2LIS_11_VAHDR and 2LIS_13_VDHDR together.
    RSMO shows data records for 11_VAHDR, but for 13_VDHDR it is zero until 11_VAHDR finishes.
    I have 8 background processes available in both BW and R/3.
    Please help.
    Regards,

    Hi,
    The case may not be that one load starts only after the other finishes.
    When you start those two loads, observe the background jobs in R/3 that start with the job name BI_REQ*.
    My guess is that, with your 8 background processes, when you start 2 loads at a time there will be 2 such jobs triggered in R/3.
    One job may go into RELEASED state in R/3 and wait for the first job to finish; once that finishes, the other one goes into ACTIVE state and starts generating IDocs.
    So this is probably due to resource availability in R/3. Monitor the jobs in R/3 closely.
    Regards,
    nkr.

  • BPC NW 7.0: Data Load: rejected entries for ENTITY member

    Hi,
    when trying to load data from a BW InfoProvider into BPC (using UJD_TEST_PACKAGE and process chain scheduling), a number of records are rejected due to missing member entries for the ENTITY dimension in the application.
    However, the ENTITY members actually do exist in the application. Also, the dimension is processed with no errors, and the dimension members are visible in the Excel Client navigation pane for selecting members.
    The error also appears when kicking off the data load from the Excel Client for BPC. Any ideas how to analyze this further or resolve it?
    Thanks,
    Claudia Elsner

    Jeffrey,
    this question is closely related to the issue, because there is also a short dump when trying to load the data into BPC. I am not sure whether both problems are directly related, though:
    Short dump with UJD_TEST_PACKAGE
    Problem description of the post:
    When running UJD_TEST_PACKAGE, I get a short dump:
    TSV_TNEW_PAGE_ALLOC_FAILED
    No more storage space available for extending an internal table.
    Other keywords are CL_SHM_AREA and ATTACHUPDATE70.
    When I looked at the notes, I found Note 928044 ("BI lock server"). Looking at the note and debugging UJD_TEST_PACKAGE leaves me with some questions:
    1. Do I need a BI lock server?
    2. Should the enque/table_size setting be increased on the central instance from 10000 to 25000 or larger?
    Claudia

  • ConsoleOne lacking Date/Time-Format-Tab for Client-Setting

    I remember the client-settings tab in ConsoleOne from GW6 for the date/time format (for the local German format).
    There is a tab with these settings in the GW8 client, but I cannot find this specific setting in ConsoleOne.
    Any ideas?
    Sincerely
    Karl

    The availability in ConsoleOne with GW6.0 is confirmed. Now, since it seems to be gone with GW8.0, how would someone deal with the need for a central adjustment of this setting? At the moment I am walking through the firm and setting this at each user's workstation in the GW8.0 client under Tools/Options/Calendar/Date-Time, hitting the button "Set to System" (or something like that - I have it in German).
    Sincerely
    Karl
    Originally Posted by laurabuckley
    Hi Karl,
    Personally, I don't recall such a setting in ConsoleOne.
    Sorry, not much use, just confirmation that you are not doing something wrong!
    Cheers,

  • Open Hub (SAP BW) to SAP HANA through DB Connection data loading: "Delete data from table" option is not working

    Issue:
    I have an SAP BW system and an SAP HANA system.
    SAP BW connects to SAP HANA through a DB Connection (named HANA).
    Whenever I create an Open Hub destination as a DB table using the DB Connection, the table is created at the HANA schema level (L_F50800_D).
    I executed the Open Hub service without checking the "Deleting data from table" option, and 16 records were loaded from BW to HANA.
    On the second execution from BW to HANA, 32 records arrived (the data is appended).
    I then executed the Open Hub service with the "Deleting data from table" option checked, and now I get the short dump DBIF_RSQL_TABLE_KNOWN.
    If I test from SAP BW system to SAP BW system, it works fine.
    Is this option supported through a DB Connection or not?
    Please see the attachment along with this discussion and help me resolve this.
    From
    Santhosh Kumar

    Hi Ramanjaneyulu,
    First of all, thanks for the reply.
    The issue is at the OH level (definition level - Destination tab and field definition): I have already selected the check box there, and that is my issue - even though it is selected, the deletion at the target is not performed.
    SAP BW to SAP HANA via DB connection:
    1. First time from BW, suppose 16 records - DTP executed - loaded up to HANA - 16, the same.
    2. Second time, executed again from BW - now the HANA side is appended, i.e. 16+16 = 32.
    3. So I selected the check box at the OH level, "Deleting data from table".
    4. Now when the DTP is executed it throws a short dump - DBIF_RSQL_TABLE_KNOWN.
    Please tell me how to resolve this. Is the "Deleting data from table" option applicable for HANA?
    Thanks
    Santhosh Kumar

  • Data loading into Data warehouse Tables

    Hi all,
    I have a scenario like this: I have a table with lakhs of records. Every day, new data arrives in which some records have been DELETEd, some new records INSERTed, and some records UPDATEd. I have to load this new data into the parent table.
    Currently, we are DELETING all parent data and inserting the new data, which takes a lot of time. There is NO PRIMARY KEY on this parent table.
    What are the best options for loading the new data? (See the sketch at the end of this thread.) Actually, there are several such tables; I have mentioned one table as an example.
    The Oracle version is 9i.
    Thanks,
    Pal

    Manish Jain (guest) wrote:
    : I need to know which is a better way of loading flat file data into warehouse tables: SQL*Loader or Oracle DataMart Suite.
    Doesn't make much odds. The advantage of using the Delimited or Fixed Length transform is that it keeps everything within DataMart Suite.
    : 1. What types of databases are supported by DataMart ODBC?
    DataMart Suite will support most RDBMSs as an ODBC source: Sybase, SQL Server, Informix, DB2, Red Brick, Teradata. Note this is only a source for the data, not a store for the DataMart.
    : 2. Can DataMart Suite be programmed for automatic loading of data?
    There is some limited scheduling functionality based on the NT Scheduler service. Where you want to run one plan following another, you may need to get a bit clever, by including some preprocessing to check that the previous plan has completed, or use a third-party scheduler that does not start the plan until the previous plan has completed.
    : 3. What is the extent of error handling in the loading operation supported by this suite?
    Pretty good. All the stuff from SQL*Loader, plus each of the transforms has its own error handling.
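    A minimal sketch of the incremental-load alternative referred to in the question above, assuming Oracle 9i's MERGE statement and assuming the parent table has some natural key that identifies a row (here a hypothetical SRC_ID column, since no primary key constraint exists); all table and column names are placeholders:
    -- Hypothetical tables: STAGING_TAB holds the daily feed, PARENT_TAB is the warehouse table.
    MERGE INTO parent_tab p
    USING staging_tab s
    ON (p.src_id = s.src_id)
    WHEN MATCHED THEN
      UPDATE SET p.col1 = s.col1,
                 p.col2 = s.col2
    WHEN NOT MATCHED THEN
      INSERT (src_id, col1, col2)
      VALUES (s.src_id, s.col1, s.col2);
    -- Rows deleted at the source still need a separate pass, for example:
    DELETE FROM parent_tab p
    WHERE NOT EXISTS (SELECT 1 FROM staging_tab s WHERE s.src_id = p.src_id);
    Without any usable key, a full refresh (ideally a direct-path INSERT /*+ APPEND */ into a truncated table) remains the usual fallback.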

  • Download Formatting Mask for BLOB column

    Hi All,
    IR Query : select "ROWID","FILE_NAME",dbms_lob.getlength("DATA_FILE") "DATA_FILE",
    "DOC_SIZE",PROPOSAL_OWNER
    from "#OWNER#"."APXTER_IMP_HDR"
    I am using a BLOB download format mask on the FILE_NAME column,
    using the following format:
    DOWNLOAD:APXTER_IMP_HDR:DATA_FILE:ROWID::::::attachment:Download
    After applying the changes, I encounter the error:
    The number of display columns in the report reached the limit. Please click Select Columns under Actions menu to minimize the report display column list.
    Thanks,
    Sombit

    Hello,
    Here you can find a tutorial about blob:
    http://www.oracle.com/webfolder/technetwork/tutorials/obe/db/apex/r31/apex31nf/apex31blob.htm
    I have a demo here - it shows how to display an image in case the file is an image; otherwise, it displays a download link:
    http://apex.oracle.com/pls/apex/f?p=63066:1
    workspace: somefeto
    user: test
    pwd: test
    Application 63066
    Also, there is a packaged application, Sample File Upload and Download, that shows you how to do it.
    You can find it in your workspace, or here if you wish:
    http://apex.oracle.com/pls/apex/f?p=10540:3:15582111009947::NO
    workspace: somefeto
    user: test
    pwd: test
    Regards,
    Fateh
    If you believe that my answer was helpful or correct, please mark it as helpful or correct.

  • Modify DATE format only for a column

    A DB table has two or more columns of DATE type, of which one column representing "Date of Birth" should not be displayed in TIMESTAMP format: when queried it should return the "mm/dd/yyyy" format without 'hh:mm:ss', while the others should stay as usual.
    This is to ensure no changes are needed in the applications querying these columns (see the sketch at the end of this thread).
    Any hints appreciated
    Thanks

    It depends on the NLS_DATE_FORMAT and NLS_TIMESTAMP_FORMAT set in the database or session and on the datatype used in the table; here is an example:
    U1@BABU>  CREATE TABLE TEST_TAB (DATE1 DATE, DATE2 TIMESTAMP);
    Table created.
    U1@BABU> INSERT INTO TEST_TAB VALUES (SYSDATE,SYSDATE);
    1 row created.
    U1@BABU>  COMMIT;
    Commit complete.
    U1@BABU> SELECT * FROM TEST_TAB;
    DATE1     DATE2
    22-DEC-06 22-DEC-06 11.44.18.000000 AM
    U1@BABU> SELECT * FROM NLS_SESSION_PARAMETERS;
    PARAMETER                      VALUE
    NLS_LANGUAGE                   AMERICAN
    NLS_TERRITORY                  AMERICA
    NLS_CURRENCY                   $
    NLS_ISO_CURRENCY               AMERICA
    NLS_NUMERIC_CHARACTERS         .,
    NLS_CALENDAR                   GREGORIAN
    NLS_DATE_FORMAT                DD-MON-RR
    NLS_DATE_LANGUAGE              AMERICAN
    NLS_SORT                       BINARY
    NLS_TIME_FORMAT                HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT           DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT        DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY              $
    NLS_COMP                       BINARY
    NLS_LENGTH_SEMANTICS           BYTE
    NLS_NCHAR_CONV_EXCP            FALSE
    17 rows selected.
    U1@BABU> ALTER SESSION SET NLS_DATE_FORMAT='DD-MON-YYYY HH24:MI:SS';
    Session altered.
    U1@BABU> SELECT * FROM TEST_TAB;
    DATE1                DATE2
    22-DEC-2006 11:44:18 22-DEC-06 11.44.18.000000 AM
    U1@BABU> SELECT TO_CHAR(DATE1,'MON-DD-YYYY HH24:MI:SS') DATE1, TO_CHAR(DATE2,'Month DD YYYY') DATE2
    FROM TEST_TAB;
    DATE1                DATE2
    DEC-22-2006 11:44:18 December  22 2006
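    Since the requirement in the question is to avoid touching the querying applications, one option is to push the formatting into a view instead of relying on session NLS settings. A minimal sketch, assuming a hypothetical PERSONS table with DATE_OF_BIRTH and HIRE_DATE columns (all names are placeholders; note the formatted column becomes a VARCHAR2 in the view):
    -- Only DATE_OF_BIRTH is reformatted; every other column passes through unchanged.
    CREATE OR REPLACE VIEW persons_v AS
    SELECT person_id,
           TO_CHAR(date_of_birth, 'MM/DD/YYYY') AS date_of_birth,
           hire_date
    FROM   persons;
    -- Queries against the view get 'MM/DD/YYYY' for the birth date,
    -- while HIRE_DATE still follows the session's NLS_DATE_FORMAT.
    SELECT date_of_birth, hire_date FROM persons_v;
    Applications would then query PERSONS_V instead of the base table, so whether this truly avoids application changes depends on whether they can be pointed at the view (for example via a synonym).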
