Importing date fields from Oracle always defaults to DT_DBTIMESTAMP, and won't accept changes
Hi
I am trying to import date fields from an Oracle database into SQL Server 2012, using SSIS. The package uses the DT_DBTIMESTAMP type in the Data Flow, but this has the same range as the SQL Server datetime type, and it bombs out all the time.
I have tried to change the type to DT_DBDATE, but no matter what I do the data type on the input columns is read-only and will not change.
Any help would be much appreciated
Andy
CRM 4, SQL Server and .Net developer using C#
Hi Andy,
The Mapping of Integration Services Data Types to Database Data Types section of the documentation gives detailed information about how Oracle data types map to Integration Services data types. To change the data type from DT_DBTIMESTAMP to DT_DBDATE, we can use either of the following two methods:
Method 1:
Right-click the Source component and click “Show Advanced Editor”.
Switch to the “Input and Output Properties” tab, expand the XX Source Output node, and expand the Output Columns node.
Click the target column, and change the DataType property to DT_DBDATE.
Method 2:
Add a Data Conversion component to the Data Flow Task.
Double-click the Data Conversion component to open the Data Conversion Transformation Editor.
Select the target input column, and change the Data Type to database date [DT_DBDATE].
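For context on why the conversion matters: SQL Server's datetime type only covers 1753-01-01 through 9999-12-31, while an Oracle DATE can hold values far outside that window (as far back as 4712 BC), which is what makes the load bomb out. A minimal Python sketch (the sample values are invented) of a pre-load check that flags offending rows:

```python
from datetime import date

# SQL Server datetime lower bound; earlier dates make the load fail
DATETIME_MIN = date(1753, 1, 1)

def out_of_range(dates):
    """Return the input dates that fall below the datetime lower bound."""
    return [d for d in dates if d < DATETIME_MIN]

sample = [date(1700, 6, 1), date(2012, 3, 15)]
print(out_of_range(sample))  # -> [datetime.date(1700, 6, 1)]
```

Converting the column to DT_DBDATE (which maps to SQL Server's date type, valid from year 1) sidesteps the range problem entirely.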
If I have misunderstood anything, please feel free to let me know.
Regards,
Mike Yin
TechNet Community Support
Similar Messages
-
SQL Developer Data Modeler Import Data Dictionary from Oracle 10g
Whenever I try to import even one table from the data dictionary, it never completes. Is there any known issue with this?
Hi Eric,
we don't have any such problem reported. Are you using the Oracle driver (it comes with the product) or another driver (JDBC-ODBC or something else)? Can you provide more info?
Best regards,
Philip -
Problem with date format from Oracle DB
Hi,
I am facing a problem with date fields from Oracle DB sources. The format of the field in the DB table is 'Date base type is DATE and DDIC type is DATS'.
I mapped the date fields to date characteristics in BI. Now the data that comes into PSA is in a weird format. It shows up like -0.PR.09-A.
I have tried changing the field settings in the DataSource to internal and external, and I have also tried mapping these date fields to text fields, without luck. All deliver the same format.
I have also tried using conversion routines like, CONVERSION_EXIT_IDATE_INPUT to change format. It also delivers me the same old result.
If any of you have suggestions, or if any of you have experienced such problems, please share your experience with me.
Thanks in advance.
Regards
Varada
Thanks for all your replies. The only solutions offered so far involve creating a view in the database. I want a solution that can be done in BI. I would appreciate it if some of you have ideas on this.
The issue again in detail
I am facing an issue with date fields from Oracle data. The data that is sent from Oracle is in the format -0.AR.04-M. I am able to convert this date in BI with a conversion routine into the format 04-MAR-0.
The problem is, I am getting data of length 10 (output format) in the format -0.AR.04-M, where the month is not in numerals. Since it is in text, it takes up one character more.
I have tried converting in different ways and increased the length in BI; the result is the same. I am wondering if we can change the date format in the database.
I am puzzled by this date format. I have checked date fields from other Oracle DB connections in BI; they get data in the format 20.081.031, which can be converted in BI. Only the system I am working with is causing a problem.
Regards
Varada -
How to load date field from a remote Oracle DB with DB Connect
Hi,
Does anyone have experience extracting a Date field from a remote Oracle database through DB Connect to BW? I am experiencing a data format issue. The extracted data shows '04-Oct-0' for '10/4/05'. It works fine if I create a view in the Oracle database to preformat the data to '20051004', but I am not allowed to create views in the remote DB.
Any suggestion is very appreciated.
Regards,
Frank
You have to change it to a varchar2 field in the format YYYYMMDD. I do not know of any other option.
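For reference, the YYYYMMDD string such a view produces can be derived from the source value once its incoming format is known; a small Python sketch (the sample value and input format are assumptions) of that reformatting:

```python
from datetime import datetime

def to_yyyymmdd(value, fmt="%m/%d/%y"):
    """Parse a date string in the given format and return it as YYYYMMDD."""
    return datetime.strptime(value, fmt).strftime("%Y%m%d")

print(to_yyyymmdd("10/4/05"))  # -> 20051004
```

In the database itself, the equivalent view column would typically be built with Oracle's TO_CHAR(date_col, 'YYYYMMDD').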
-
I exported a dump from Oracle database 8.1.7 and am trying to import it into 10.2.0. During the import, it prompts as follows:
8.1.7
SQL> select * from nls_database_parameters where parameter like '%SET%';
PARAMETER VALUE
NLS_CHARACTERSET US7ASCII
NLS_NCHAR_CHARACTERSET US7ASCII
10.2.0
SQL> select * from nls_database_parameters where parameter like '%SET%';
PARAMETER VALUE
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_CHARACTERSET WE8MSWIN1252
C:\Documents and Settings\Administrator>imp system/manager file=H:\CGlByte\cglbyte_03-07-2013Wed.dmp fromuser=system touser=cglbyte ignore=y
Import: Release 10.2.0.4.0 - Production on Thu Jul 4 16:11:49 2013
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V08.01.07 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses WE8MSWIN1252 character set (possible charset conversion)
export client uses WE8ISO8859P1 character set (possible charset conversion)
export server uses US7ASCII NCHAR character set (possible ncharset conversion)
Import terminated successfully without warnings.
Please suggest a remedy.
Regards,
8.1.7
AMERICAN_AMERICA.WE8ISO8859P1
10.2.0
AMERICAN_AMERICA.WE8MSWIN1252
exp system/manager file=d:\cgl.dmp owner=cgl. Even though I set the character set before the export, it had no impact. -
Error when reading BLOB field from Oracle using TopLink
We are experiencing a very annoying problem when trying to read a BLOB field from Oracle 8.1.6.2.0 using TopLink 3.6.3. I have attached the exception stack trace that is reported to the console. As far as I can judge, a fault at oracle.sql.LobPlsqlUtil.plsql_length() happens first, and then one at TOPLink.Private.DatabaseAccess.DatabasePlatform.convertObject().
The exception repeats permanently, which is very critical for us.
ServerSession(929808)--Connection(5625701)--SELECT LOBBODY, ID, LABEL, FK_OBJECT_ID, FK_OBJECTTYPE FROM NOTE WHERE (ID = 80020)
INTERNAL EXCEPTION STACK:
java.lang.NullPointerException
at oracle.sql.LobPlsqlUtil.plsql_length(LobPlsqlUtil.java:936)
at oracle.sql.LobPlsqlUtil.plsql_length(LobPlsqlUtil.java:102)
at oracle.jdbc.dbaccess.DBAccess.lobLength(DBAccess.java:709)
at oracle.sql.LobDBAccessImpl.length(LobDBAccessImpl.java:58)
at oracle.sql.BLOB.length(BLOB.java:71)
at TOPLink.Private.Helper.ConversionManager.convertObjectToByteArray(ConversionManager.java:309)
at TOPLink.Private.Helper.ConversionManager.convertObject(ConversionManager.java:166)
at TOPLink.Private.DatabaseAccess.DatabasePlatform.convertObject(DatabasePlatform.java:594)
at TOPLink.Public.Mappings.SerializedObjectMapping.getAttributeValue(SerializedObjectMapping.java:43)
at TOPLink.Public.Mappings.DirectToFieldMapping.valueFromRow(DirectToFieldMapping.java:490)
at TOPLink.Public.Mappings.DatabaseMapping.readFromRowIntoObject(DatabaseMapping.java:808)
at TOPLink.Private.Descriptors.ObjectBuilder.buildAttributesIntoObject(ObjectBuilder.java:173)
at TOPLink.Private.Descriptors.ObjectBuilder.buildObject(ObjectBuilder.java:325)
at TOPLink.Private.Descriptors.ObjectBuilder.buildObjectsInto(ObjectBuilder.java:373)
at TOPLink.Public.QueryFramework.ReadAllQuery.execute(ReadAllQuery.java:366)
at TOPLink.Public.QueryFramework.DatabaseQuery.execute(DatabaseQuery.java:406)
I started the application with Oracle JDBC logging on and found that the problem may originate in a possible lack of synchronization in the pooled connection implementation:
DRVR FUNC OracleConnection.isClosed() returned false
DRVR OPER OracleConnection.close()
DRVR FUNC OracleConnection.prepareCall(sql)
DRVR DBG1 SQL: "begin ? := dbms_lob.getLength (?); end;"
DRVR FUNC DBError.throwSqlException(errNum=73, obj=null)
DRVR FUNC DBError.findMessage(errNum=73, obj=null)
DRVR FUNC DBError.throwSqlException(reason="Logical handle no longer valid",
SQLState=null, vendorCode=17073)
DRVR OPER OracleConnection.close()
so the prepareCall() is issued against an already closed connection and the
call fails.
I assume we have been using a JDBC 2.0-compliant driver. We tried the drivers that Oracle supplies for the 8.1.6 and 8.1.7 versions. To be honest, I couldn't find any information about the JDBC specification they conform to. Does that mean these drivers may not be 100% compatible with the JDBC 2.0 spec? How can I find out if they are 2.0 compliant?
Also, I have downloaded the Oracle 9.2.0.1 JDBC drivers. These seemed to work fine until we found another incompatibility, which made us go back to the 8.1.7 driver:
UnitOfWork(7818028)--Connection(4434104)--INSERT INTO STATUSHISTORY (CHANGEDATE, FK_SET_STATUS_ID) VALUES ({ts '2002-10-17 16:46:54.529'}, 2)
INTERNAL EXCEPTION STACK:
java.sql.SQLException: ORA-00904: invalid column name
at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java:289)
at oracle.jdbc.ttc7.Oall7.receive(Oall7.java:573)
at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java:1891)
at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java:1093)
at oracle.jdbc.driver.OracleStatement.executeNonQuery(OracleStatement.java:2047)
at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java:1940)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:2709)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:589)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeDirectNoSelect(DatabaseAccessor.java:906)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeNoSelect(DatabaseAccessor.java:960)
at TOPLink.Private.DatabaseAccess.DatabaseAccessor.executeCall(DatabaseAccessor.java:819)
at TOPLink.Public.PublicInterface.UnitOfWork.executeCall(UnitOfWork.java:
Hello Yury,
I believe the problem is that TopLink's ServerSession by default executes read queries concurrently on the same connection. It does this to reduce the number of required connections for the read connection pool. Normally this is a good concurrency optimization, however some JDBC drivers have issues when this is done. I had thought that with Oracle JDBC 8.1.7 this issue no longer occurred, but perhaps it is only after version 9. I believe that the errors were only with the thin JDBC driver, not the OCI, so using the OCI driver should also resolve the problem. Using RAW instead of BLOB would also work.
You can configure TopLink to resolve this problem through using exclusive read connection pooling.
Example:
serverSession.useExclusiveReadConnectionPool(int minNumberOfConnections, int maxNumberOfConnections);
This will ensure that TopLink does not try to concurrently execute read queries on the same connection.
I'm not exactly sure what your second problem with the 9.x JDBC drivers is. From the SQL and stack trace it would seem that the driver does not like the JDBC timestamp syntax. You can have TopLink print timestamp in Oracle's native SQL format to resolve this problem.
Example:
serverSession.getLogin().useNativeSQL();
Make sure you configure your server session before you log in, or, if using the TopLink Session Manager, perform the customizations through a SessionEventListener preLogin event. -
CRVS2010 beta - Date field from database does not display in report
Hi there - can someone please help?!
I am getting a problem where a date field from the database does not display in the report viewer (it displays on my dev machine, but not on the client machines; details given below)
I upgraded to VS 2010
I am using the CRVS2010 Beta
My development machine is Windows 7 - and so is my fellow developer's
We are using Microsoft SQL Server 2000
We run the queries within VS and then send the data table to the report using .SetDataSource
We have a few reports which display the date on our dev machines (whether we run the EXE or from within Visual Studio)
When we roll out to the client machines (running Windows XP SP3) then everything works, except that the date does not display (on quite a few reports)
This is the only real issue I have had - a show stopper for me
The rest works well - any input will be greatly appreciated
Regards,
Ridwan
Hi Ridwan,
After much testing, I have it all working now using CRDB_adoplus.dll as a data source (XML).
Alter your Config file to look like this:
<startup useLegacyV2RuntimeActivationPolicy="true">
<supportedRuntime version="v4.0"/>
</startup>
Then use the code below; note that CR requires the schema to be able to read the date format.
private void SetToXML_Click(object sender, EventArgs e)
CrystalDecisions.CrystalReports.Engine.ReportDocument rpt = new CrystalDecisions.CrystalReports.Engine.ReportDocument();
ISCDReportClientDocument rcd;
rcd = rptClientDoc;
string connString = "Provider=SQLOLEDB;Data Source=dwcb12003;Database=xtreme;User ID=sb;Password=password";
string sqlString = "Select * From Orders";
OleDbConnection oleConn = new OleDbConnection(connString);
OleDbDataAdapter oleAdapter = new OleDbDataAdapter(sqlString, oleConn);
//OleDbDataAdapter oleAdapter2 = new OleDbDataAdapter(sqlString2, oleConn);
DataTable dt1 = new DataTable("Orders");
oleAdapter.Fill(dt1);
System.Data.DataSet ds = new System.Data.DataSet();
ds.Tables.Add(dt1);
// We need the schema to get the data formats
ds.WriteXml(@"c:\sc.xml", XmlWriteMode.WriteSchema);
//Create a new Database Table to replace the reports current table.
CrystalDecisions.ReportAppServer.DataDefModel.Table boTable = new CrystalDecisions.ReportAppServer.DataDefModel.Table();
//boMainPropertyBag: These hold the attributes of the tables ConnectionInfo object
PropertyBag boMainPropertyBag = new PropertyBag();
//boInnerPropertyBag: These hold the attributes for the QE_LogonProperties
//In the main property bag (boMainPropertyBag)
PropertyBag boInnerPropertyBag = new PropertyBag();
//Set the attributes for the boInnerPropertyBag
boInnerPropertyBag.Add("File Path ", @"C:\sc.xml");
boInnerPropertyBag.Add("Internal Connection ID", "{680eee31-a16e-4f48-8efa-8765193dccdd}");
//Set the attributes for the boMainPropertyBag
boMainPropertyBag.Add("Database DLL", "crdb_adoplus.dll");
boMainPropertyBag.Add("QE_DatabaseName", "");
boMainPropertyBag.Add("QE_DatabaseType", "");
//Add the QE_LogonProperties we set in the boInnerPropertyBag Object
boMainPropertyBag.Add("QE_LogonProperties", boInnerPropertyBag);
boMainPropertyBag.Add("QE_ServerDescription", "NewDataSet");
boMainPropertyBag.Add("QE_SQLDB", "False");
boMainPropertyBag.Add("SSO Enabled", "False");
//Create a new ConnectionInfo object
CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo boConnectionInfo =
new CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo();
//Pass the database properties to a connection info object
boConnectionInfo.Attributes = boMainPropertyBag;
//Set the connection kind
boConnectionInfo.Kind = CrConnectionInfoKindEnum.crConnectionInfoKindCRQE;
//*EDIT* Set the User Name and Password if required.
boConnectionInfo.UserName = "";
boConnectionInfo.Password = "";
//Pass the connection information to the table
boTable.ConnectionInfo = boConnectionInfo;
//Get the Database Tables Collection for your report
CrystalDecisions.ReportAppServer.DataDefModel.Tables boTables;
boTables = rptClientDoc.DatabaseController.Database.Tables;
//For each table in the report:
// - Set the Table Name properties.
// - Set the table location in the report to use the new modified table
boTable.Name = "Orders";
boTable.QualifiedName = "Orders";
boTable.Alias = "Orders";
rptClientDoc.DatabaseController.SetTableLocation(boTables[0], boTable);
//Verify the database after substituting the new table.
//To ensure that the table updates properly when adding Command tables or Stored Procedures.
rptClientDoc.VerifyDatabase();
MessageBox.Show("Data Source Set", "RAS", MessageBoxButtons.OK, MessageBoxIcon.Information);
Thanks again
Don -
URGENT! Can I use a THIN JDBC driver to access a CLOB field from an Oracle 8.0.5 DB?
I think you'd need to contact Oracle support to get access to older versions of the driver.
Since 8.0.5 isn't supported any longer, however, is it possible for you to update your Oracle client to one of the supported releases, 8.1.7 or 9i?
Justin -
Data transfer from oracle database(external database) to sap database
Hi Abapers,
I have a requirement like this: upload Excel data from an Oracle database (external database) to the SAP database. Please, can anyone help me with how to create a BDC program for this requirement?
Thank you in advance.
Jnana
Hi,
use function module 'ALSM_EXCEL_TO_INTERNAL_TABLE' to upload data from the Excel sheet to an internal table. From there you can save it to the database using BDC function modules. -
DATA EXTRACTION FROM ORACLE & SYBASE
Hello members...
Good day. This is my first posting on this site. I am new to Oracle and have a question regarding data extraction from Oracle and Sybase.
My project has two applications, one having Oracle as the database and the other having Sybase as the database. On the proposed production schedule, there would be an interface data update from the Oracle database to the Sybase database.
My aim is to check that the data is in sync after the update. We plan to extract the data from the two databases (Oracle and Sybase) and compare it in a third-party tool. We have the tool to compare with. Can any of you suggest a good data extraction tool? I have to set this extraction up as a batch job in the nightly batch schedule.
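The comparison step itself is simple set arithmetic once both extracts exist as flat files; a minimal Python sketch (the file names and row layout are assumptions) of checking two CSV extracts for sync:

```python
import csv

def diff_extracts(path_a, path_b):
    """Return (rows only in extract A, rows only in extract B)."""
    def load(path):
        with open(path, newline="") as f:
            return {tuple(row) for row in csv.reader(f)}
    rows_a, rows_b = load(path_a), load(path_b)
    return rows_a - rows_b, rows_b - rows_a
```

Set difference ignores row order and duplicates; if duplicate rows matter, collections.Counter is the drop-in alternative.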
Thanks in advance,
Regards,
Krishna Kumar...
Sybase provides the best data extraction utility, called BCP. You can load and extract data with it, and it is very user friendly. From your Unix prompt, if you type in bcp ? you will see all the switches and help messages you need. It is so simple.
Oracle doesn't have such a utility to extract data from a table.
You need to write a stored procedure and use the UTL_FILE package to extract data from a table and write it into a file. -
Extract tables from "Oracle Apps 11.0.3" and load them into "Oracle Apps 11i"
Hi hussein,
I am tasked to extract the following tables from Oracle Apps 11.0.3 and load them into Oracle Apps 11.5.10.2.
PO_VENDORS
PO_VENDOR_SITES_ALL
Can I use export / import?
Thanks a lot
Hi,
I believe export/import of individual Apps tables is not supported, due to object dependencies and referential integrity.
Regards,
Hussein -
Email from iPhoto always starts the Mail app and wants .Mac info - I don't use it
I am trying to send photos from iPhoto; however, it always defaults to Mail and wants a .Mac logon, whereas I use Entourage via Exchange for my email. Please help, as I have to manually find the pic and attach it in Entourage; I am sure there is a better way.
In iPhoto preferences, the General (default) tab has a menu near the bottom "Email pictures using" and "Mail" is the default. You can change that to Entourage, and it will then default to Entourage.
-
I have imported a CSV of my mortgage repayments; my father and I pay separate amounts. I am trying to show how much I have paid vs what he has paid, how much is owed, and what % we each own. We are 50/50 on the property. Help?
Hi Spears,
I'm not sure what your actual question is.
Does the CSV file include only your payments, or your payments and your dad's payments as separate items?
Do you want to compare ongoing payments, or do you want to compare the current totals paid by each? Is the 'we' in "what we own" collective, or individual? Ditto for "how much is owed."
Too many loose ends to tie down a clear answer.
Regards,
Barry -
How do I remove my credit card from my account? Right now I have debit or credit card and when I try to download a free app like Facebook, it tells me to update my billing info and won't accept it blank. It says I owe for a previous purchase.
Sorry- edited:
How do I remove my credit card from my account? Right now I have no* debit or credit card and when I try to download a free app like Facebook, it tells me to update my billing info and won't accept it blank. It says I owe for a previous purchase. So take it back and unlock my ability to DL free apps. -
Loading Date fields from flat file to Oracle tables
Hi,
I have a flat file with a few date columns. I have given the format for the date field while creating the data store for the flat file. The format I used is 'YYYY-MM-DD'.
But I get an error when I execute it after associating the module with LKM & IKM.
The error message is as follows:
org.apache.bsf.BSFException: exception from Jython: Traceback (innermost last):
File "<string>", line 3, in ?
OS command has signalled errors
Can anyone help me out?
Thanks
At the time of DataStore creation, take that date field as a string.
Then, in the interface, when you map this date field to the table field, use CONVERT(<field name>, DATE) in the expression editor.
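The same stage-as-string-then-convert idea can be sketched outside ODI; the Python below (the column values are made up) parses the staged strings with the declared 'YYYY-MM-DD' format and collects the rows that fail, which is what the conversion step would reject:

```python
from datetime import datetime

def convert_dates(values, fmt="%Y-%m-%d"):
    """Convert staged date strings; return (parsed dates, rejected strings)."""
    good, bad = [], []
    for v in values:
        try:
            good.append(datetime.strptime(v, fmt).date())
        except ValueError:
            bad.append(v)
    return good, bad

good, bad = convert_dates(["2013-07-04", "04/07/2013"])
print(bad)  # -> ['04/07/2013']
```

Staging as a string first means one malformed row surfaces as a reject instead of failing the whole load.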