Loading data from multiple files to multiple tables
How should I approach creating an SSIS package to load data from multiple files into multiple tables? Also, the files may have overlapping data, so I might have to create a stored procedure for it. For example, the first day's file has data from Aug 1 - Aug 10, and the second day's
file might have data from Aug 5 to Aug 15. So I might have to look for the max and min dates and delete the table's rows within that date range.
That's OK. A ForEach Loop container will be able to iterate through the files. You can declare a variable inside the loop to capture the file names; choose "fully qualified" as the retrieval option in the loop.
Then inside the loop:
1. Add an Execute SQL Task to delete overlapping data from the table. One question here: where will you get the date range from? Does it come inside the file name?
2. Add a Data Flow Task with a file source pointing to the file. For this, add a suitable connection manager (Excel/Flat File etc.) and map its connection string property to the file-name variable using expressions.
3. Add an OLE DB Destination pointing to the table. You can use the "table name or view name variable - fast load" option and map it to a variable to make the table name dynamic; just set the corresponding value for the variable to get the correct table name.
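A minimal sketch of the step-1 statement, assuming the file's min/max dates are parsed (e.g. from the file name) into SSIS variables and mapped to the task's parameters; the table and column names (dbo.Sales, SaleDate) are hypothetical, not from the original post:

```sql
-- Hypothetical target table/column; the two ? markers map to SSIS
-- variables holding the incoming file's min and max dates.
DELETE FROM dbo.Sales
WHERE SaleDate BETWEEN ? AND ?;
```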
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
Similar Messages
-
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from .csv file into oracle table.
Is there any mechanism I can use to do this?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
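For the SQL*Loader route, a minimal control file sketch; the table and columns here (emp etc.) are hypothetical, not from the original post:

```sql
-- emp.ctl (hypothetical example)
-- Run with: sqlldr scott/tiger control=emp.ctl log=emp.log
LOAD DATA
INFILE 'emp.csv'
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(empno, ename, sal)
```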
Missed Karthick's post ... already there :)
-
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from .csv file into existing /new table. Also checked out Vikas's application regarding the same. I am trying to explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is: before the csv data is loaded into the timesheet table, is there any way of validating the project_key (which is the primary key of the projects table) against the projects table? I need to perform similar validations with other columns, like customer_id against the customers table. Basically the load should be done only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility? Or is there another method of accomplishing the same?
Does Vikas's application do what the utility does ( i am assuming that the code being from 2005 the utility was not incorporated in APEX at that time). Any helpful advise is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
you can create hidden items in the page to validate previous records before insert data.
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Load data from a file with multiple record types to a single table-sqlldr
We are using two datastores which refer to the same file. The file has 2 types of records header and detail.
h011234tyre
d01rey5679jkj5679
h011235tyrr
d01rel5678jul5688
d01reh5698jll5638
Can someone help in loading these lines from one file with only two datastores (not 2 separate files) using the File to Oracle (SQLLDR) Knowledge Module?
Hi,
Unfortunately the IKM SQLLDR doesn't let you write the "when" condition in the ctl file.
If you wish a simple solution, just add an option (drop me an email if you want an LKM with this).
The point is, with a single option you will control the WHEN ctl clause and, for instance, can define:
1) create 2 datastores (1 for each record type)
2) the first position will be a column in each datastore
3) write the WHEN condition on this first column in the LKM option in the interface.
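For reference, the ctl clause such an option would have to generate is SQL*Loader's WHEN, keyed on the record-type prefix. A sketch matching the sample lines above; the table names and field positions are assumptions:

```sql
-- Hypothetical sketch: route each line on its first 3 characters
LOAD DATA
INFILE 'mixed.dat'
INTO TABLE header_tab
WHEN (1:3) = 'h01'
  (rec_type POSITION(1:3), header_id POSITION(4:7), descr POSITION(8:11))
INTO TABLE detail_tab
WHEN (1:3) = 'd01'
  (rec_type POSITION(1:3), detail_key POSITION(4:10), ref POSITION(11:17))
```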
Does it help you? -
Load data from flat file to target table with scheduling in loader
Hi All,
I have requirement as follows.
I need to load data to the target table every Saturday. My source file consists of data for several states; every week I have to load one particular state's data to the target table.
If in the first week I load AP data, then in the second week, on Saturday, Karnataka, and so on.
Can you please also provide code showing how I can schedule the data load for every Saturday with a different state column value automatically?
Thanks & Regards,
Sekhar
The best solution would be:
get the flat file to the Database server
define an External Table pointing to that flat file
insert into destination table(s) as select * from external_table
Loading only a single state's data each Saturday might mean trouble, but assuming there are valid reasons to do so, you could:
create a job with a p_state parameter executing insert into destination table(s) as select * from external_table where state = p_state
create a Scheduler chain where each member runs on the next Saturday, executing the same job with a different p_state parameter
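A sketch of the external-table approach above, under assumptions: data_dir is an existing DIRECTORY object, and all table and column names are hypothetical:

```sql
-- Hypothetical external table over the weekly flat file
CREATE TABLE ext_state_data (
  state  VARCHAR2(30),
  col1   VARCHAR2(100),
  col2   NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('states.csv')
)
REJECT LIMIT UNLIMITED;

-- The scheduled job's body, parameterized by state:
INSERT INTO destination_table
SELECT * FROM ext_state_data WHERE state = :p_state;
```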
Managing Tables
Oracle Scheduler Concepts
Regards
Etbin -
SQL*Loader not loading data from csv format to oracle table.
Hi,
I am trying to load data from flat files into an oracle table; it is not loading decimal values. Only character data is loaded, and the number columns are null.
CONTROL FILE:
LOAD DATA
INFILE cost.csv
BADFILE consolidate.bad
DISCARDFILE Sybase_inventory.dis
INSERT
INTO TABLE FIT_UNIX_NT_SERVER_COSTS
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(
HOST_NM,
SERVICE_9071_DOLLAR FLOAT,
SERVICE_9310_DOLLAR FLOAT,
SERVICE_9700_DOLLAR FLOAT,
SERVICE_9701_DOLLAR FLOAT,
SERVICE_9710_DOLLAR FLOAT,
SERVICE_9711_DOLLAR FLOAT,
SERVICE_9712_DOLLAR FLOAT,
SERVICE_9713_DOLLAR FLOAT,
SERVICE_9720_DOLLAR FLOAT,
SERVICE_9721_DOLLAR FLOAT,
SERVICE_9730_DOLLAR FLOAT,
SERVICE_9731_DOLLAR FLOAT,
SERVICE_9750_DOLLAR FLOAT,
SERVICE_9751_DOLLAR FLOAT,
GRAND_TOTAL FLOAT
)
In the table, the FLOAT columns are defined as NUMBER(20,20).
SAmple i/p from csv:
ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
Only abos12 is loaded; the rest of the numbers are not loaded.
Thanks.
Hi,
Thanks for your reply.
It's throwing errors:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
Insert option in effect for this table: INSERT
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
HOST_NM FIRST 255 , O(") CHARACTER
SERVICE_9071_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9310_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9700_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9701_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9710_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9711_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9712_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9713_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9720_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9721_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9730_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9731_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9750_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9751_DOLLAR NEXT 255 , O(") CHARACTER
GRAND_TOTAL NEXT 255 , O(") CHARACTER
value used for ROWS parameter changed from 64 to 62
Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 2: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 3: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 4: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 5: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 6: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 7: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 8: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 9: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 10: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 11: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Please help me on it.
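(Not answered in the thread, but two things stand out. First, only the last column fails on every record, which is the classic symptom of Windows line endings: a trailing carriage return sticks to GRAND_TOTAL when the file is loaded on a Unix server, and stripping it is a common fix. Second, NUMBER(20,20) has scale equal to precision, so it can only store values below 1; values like 122.46 would fail anyway. A hedged sketch of the first fix, for the control file's field list:)

```sql
-- In the control file, strip a possible trailing CR from the final field
GRAND_TOTAL "TO_NUMBER(REPLACE(:GRAND_TOTAL, CHR(13)))"
```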
Thanks, -
How to load the data from .csv file to oracle table???
Hi,
I am using oracle 10g, plsql developer. Can anyone help me with how to load the data from a .csv file into an oracle table? The table is already created with the required columns. The .csv file has about 10 lakh (1,000,000) records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed?
Thanks in advance
981145 wrote:
Can you tell me more about SQL*Loader? How do I know whether that utility is available to me or not? I am using an oracle 10g database and plsql developer.
SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
The command is:
sqlldr
Type it and see if you have it installed.
Have a look also at the FAQ link posted by Marwin.
There are plenty of examples also on the web.
Regards.
Al -
Loading data from flat file to table
Hi,
I am using OWB 10g R2. I have to develop a map which loads data from flat file to table.
1. I have created a .csv file.
2. Import that into OWB by Module Wizard.
3. I have created one external table, which use the .csv file.
After that, what do I have to do to successfully load the data into the table?
Regards,
Sanjoy
Hi Sanjoy,
You can build a mapping to move the data from the external table to the target table. More specifically:
1. create a mapping
2. add external table to mapping
3. add table to mapping for target (it can be unbound)
4. map from external table ingroup to target table ingroup (all columns will be mapped)
5. if target table was unbound right click on target table and click create and bind
This is a basic mapping you can have for this scenario. There is a post here which does something like this, but it will create a File to Table mapping which will use SQLLoader.
http://blogs.oracle.com/warehousebuilder/2009/01/expert_useful_scripts_from_simple_primitives.html
Cheers
David -
Upload data from excel file to Oracle table
Dear All,
I have to upload data from excel file to Oracle table without using third party tools and without converting into CSV file.
Could you tell me please how can i do this using PLSQl or SQL Loader.
Thanks in advance.
As Billy mentioned, use the ODBC interface: HS (Heterogeneous Services) generic connectivity, which is a layer over traditional ODBC, to access a non-Oracle data source. Here is a link you can try out, and come back here if you have any problem:
http://www.oracle-base.com/articles/9i/HSGenericConnectivity9i.php
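Once the HS setup from that article is in place, the load itself is just a sketch like this (the database link name excel_link, the sheet name, and target_table are hypothetical):

```sql
-- excel_link: hypothetical DB link created via HS Generic Connectivity
INSERT INTO target_table
SELECT * FROM "Sheet1$"@excel_link;
```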
Khurram -
Load data from xml file in oracle data base
Hi all,
I'd like to know if it is possible to load data from an XML file into tables of an oracle database through SQL*Loader, loading into normal columns rather than an XMLType column. For example, I have an xml file:
<person name="kate" surname="fari" city="new york" >
<son name="faus" age="18"/>
<son name="doly" age="10"/>
</person>
and I load into tables:
table: person
name    surname   city
kate    fari      new york
table: son
name   age
doly   10
faus   18
Thank you for your reply!
Ninova
Hi,
This may be found at:
SQL Loader to upload XML file -
Loading data from flat file...
Hello,
I am experiencing a problem where I can't load data from a flat file into a table. I have reverse-engineered the flat file into ODI, but I don't know how to load the reversed data into an RDBMS table. I also don't know how to create this RDBMS table from within ODI so that it is reflected on the DB, nor how to load the data from the flat file into this table without having to add the columns manually in order to map between the flat file and the table.
In conclusion, I need to know how to create an RDBMS table from within ODI on the database. And how to automatically map the flat file to the DB table and load the data into the DB table.
Regards,
Hossam
Hi Hossam,
We can use an ODI procedure to create the table in the DB.
Make sure the column names in the table are the same as the column names in the flat file, so that the columns are mapped automatically.
Regarding loading data from a flat file (i.e. our source table is a flat file): up to ODI 10.1.3.4 we need to insert the datastore manually, since the file system cannot be reverse-engineered.
Please let me know, Hossam, if I can assist you further.
Thanks and Regards,
Andy -
Error when executing interface which load data from csv file which has 320
Hi,
Can some one provide a resolution for below error:
I have created an interface which loads data from a csv file with 320 columns into a synonym which also has 320 columns,
using LKM File to SQL and IKM SQL Control Append.
I am getting below error when executing the interface :
com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
BSF info: Create external table at line: 0 column: columnNo
at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
... 11 more
Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
out.print("createTblCmd = r\"\"\"\ncreate table ") ;
out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t", "","")) ;
out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
out.print("'\n\t\t") ;
out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
out.print("\n\t\tBADFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
out.print("\n") ;
if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
} else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
out.print("'\n\t\t") ;
if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
} else {out.print("OPTIONALLY ENCLOSED BY '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
out.print("' AND '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
out.print("' ") ;
}out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
}out.print("\tLOCATION (") ;
out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
out.print(")\n)\n") ;
out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
out.print("\nREJECT LIMIT ") ;
out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
****** ORIGINAL TEXT ******
createTblCmd = r"""
create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t", "","")%>
ORGANIZATION EXTERNAL
TYPE ORACLE_LOADER
DEFAULT DIRECTORY dat_dir
ACCESS PARAMETERS
RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
<%=odiRef.getUserExit("EXT_CHARACTERSET")%>
<%=odiRef.getUserExit("EXT_STRING_SIZE")%>
BADFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
LOGFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
DISCARDFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
SKIP <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
<% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
FIELDS
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%} else {%>
FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
<% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
<%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%}%> LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
<%=odiRef.getUserExit("EXT_PARALLEL")%>
REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
# Create the statement
myStmt = myCon.createStatement()
# Execute the trigger creation
myStmt.execute(createTblCmd)
myStmt.close()
myStmt = None
# Commit, just in case
myCon.commit().
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
The issue is encountered because the text delimiter defined for the source file did not consist of a pair of delimiters; note the substring(0,1) and substring(1,2) calls on [FILE_ENC_FIELD] in the template above, which throw StringIndexOutOfBoundsException when only a single delimiter character is supplied.
Please see support Note [ID 1469977.1] for details. -
How to load data from UTL file to database
Hi All,
I am new in this technologies.
I am facing the below problem loading data from a UTL file into the database.
Below is the script written by me:
CREATE OR REPLACE PROCEDURE load_data AS
v_line VARCHAR2(2000);
v_file UTL_FILE.FILE_TYPE;
v_dir VARCHAR2(250);
v_filename VARCHAR2(50);
v_1st_Comma NUMBER;
v_2nd_Comma NUMBER;
v_deptno NUMBER;
v_dname VARCHAR2(14);
v_loc VARCHAR2(13);
BEGIN
v_dir := ':f/rashi/dataload';
v_filename := 'fake.txt';
v_file := UTL_FILE.FOPEN(v_dir, v_filename, 'r');
LOOP
BEGIN
UTL_FILE.GET_LINE(v_file, v_line);
EXCEPTION
WHEN no_data_found THEN
exit;
END;
v_1st_Comma := INSTR(v_line, ',' ,1 , 1);
v_2nd_Comma := INSTR(v_line, ',' ,1 , 2);
v_deptno := SUBSTR(v_line, 1, v_1st_Comma-1);
v_dname := SUBSTR(v_line, v_1st_Comma+1, v_2nd_Comma-v_1st_Comma-1);
v_loc := SUBSTR(v_line, v_2nd_Comma+1);
DBMS_OUTPUT.PUT_LINE(v_deptno || ' - ' || v_dname || ' - ' || v_loc);
INSERT INTO don
VALUES (v_deptno, UPPER(v_dname), UPPER(v_loc));
END LOOP;
UTL_FILE.FCLOSE(v_file);
COMMIT;
END;
show error
I am getting the below errors:
LINE/COL ERROR
3/8 PL/SQL: Item ignored
3/8 PLS-00201: identifier 'UTL_FILE' must be declared
15/1 PL/SQL: Statement ignored
15/1 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
20/1 PL/SQL: Statement ignored
20/19 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
36/1 PL/SQL: Statement ignored
LINE/COL ERROR
36/17 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
Could anyone please advise me on the above errors?
Thanks.
First of all, is the file located on (or accessible from) the database server? If not, you can't load it with either UTL_FILE or an external table; you would need to use a client-side tool like SQL*Loader, or write your own client-side load. If the file is accessible from the database server then, as sb92075 noted, an external table (assuming you are not on some ancient Oracle version) is the way to go.
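(Not stated in the reply, but the PLS-00201 on UTL_FILE itself usually means the schema lacks the EXECUTE privilege on that package. Assuming the file really is on the server, something like the following, run as a DBA and with hypothetical names, is also needed before the procedure can compile and open the file; v_dir should then be the directory object name, e.g. 'DATA_LOAD_DIR', rather than an OS path:)

```sql
-- Run as a privileged user; the user and directory names are hypothetical
GRANT EXECUTE ON SYS.UTL_FILE TO rashi;
CREATE OR REPLACE DIRECTORY data_load_dir AS '/f/rashi/dataload';
GRANT READ ON DIRECTORY data_load_dir TO rashi;
```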
SY. -
Loading data from flat file to ODS
Hi Experts,
I have developed a report in which a flat file is the datasource and the data target is a DSO.
I have a requirement to increment a counter value in the TVARVC table after every load completes.
Can this be done through a process chain? I believe it can be done by creating a variable.
If so, can any expert send me sample example code?
Thanks in advance.
Sreedhar.M
Hi Sreedhar,
We have a similar scenario in which we load data from a flat file to a datasource; the file name (x_20081010.dat is such a format) is fetched from the low value of the TVARVC table. Once the load executes successfully, a program increments the low value by one day, and the next day the file with that name is placed on the application server.
You can create an ABAP program variant and attribute for that. This is sample code which searches for the date string; once your file has loaded successfully, you can write similar code to increment the counter value you are asking for:
search s_tvarvc-low for '20'.
if sy-subrc ne 0.
sy-msgty = 'E'.
sy-msgid = 'ZWM1'.
sy-msgno = '000'.
sy-msgv1 = 'Unable to find DATE to increment'.
raise unable_increment.
else.
w_date = s_tvarvc-low+sy-fdpos(08).
w_nextdate = w_date + 1.
replace w_date with w_nextdate into s_tvarvc-low.
modify tvarvc from s_tvarvc.
endif.
Regards
vamsi -
Loading Data From .CSV files
Hi Gurus,
I m new to BW
Can u please help me out solving the below problem
When I was trying to load data from a flat file (.CSV) like this:
Cust no, Cust name, Cust address
cust001,coca cola, Inc., 17 north road, Hyderabad - 35
infoObjects are
ZS_CUST,ZS_CUSTNM,ZS_CUSTAD
I am getting only the first three comma-separated values into my attribute table,
i.e. cust001, coca cola and Inc go into three fields in the attribute table (and PSA).
Now my point is: without replacing the separator ',' with anything else
(if we have millions of records in the file, changing ',' to some other separator would presumably be time-consuming), can we extract the complete data into the attribute-table fields correctly?
Thanks
Suri
Surender,
Once you specify "," as the field separator in BW, it will be interpreted as such. So "coca cola, Inc" will be interpreted and loaded as two separate fields.
"Data should be fixed in the source, whenever possible" is a phrase you'll hear a lot when dealing with BW, especially when loading data from flat files.
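If the source extract can be changed without replacing the separator, the standard CSV fix is to enclose fields in quotes, e.g.:

```
cust001,"coca cola, Inc.","17 north road, Hyderabad - 35"
```

The flat-file load settings can then be given the double quote as the escape/enclosing character so the embedded commas survive.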
Good luck.
Regards,
Luis -
Error
[Load data from excel file [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I am using BIDS (Microsoft Visual Studio 2008) and running the package to load the data from Excel.
My machine has 32-bit Excel, hence I have set the Run64BitRuntime property to False.
But the error still occurs.
I checked on Google: many have set DelayValidation to true at the Data Flow Task level, but even setting it at both the Excel connection manager and the DFT level, it doesn't work.
Mudassar
That's my connection string:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=E:\SrcData\Feeds\Utilization.xlsx;Extended Properties="Excel 12.0;HDR=NO";
Excel 2010 is installed and it is the 32-bit edition.
Are you referring to installing this component, AccessDatabaseEngine_x64.exe?
http://www.microsoft.com/en-us/download/details.aspx?id=13255
Mudassar
You can try an OLE DB provider in that case; see
http://dataintegrity.wordpress.com/2009/10/16/xlsx/
You might need to download and install the MS Access redistributable:
http://www.microsoft.com/en-in/download/details.aspx?id=13255