How to regularly load data from a .csv file into a database (using APEX)
Hi,
I am using APEX 3. I need to load data from a CSV file into APEX, automatically through code, at a regular interval of 5-10 seconds.
Is this possible? If yes, how? Please reply as early as possible; this will decide whether or not to use APEX for this application.
This is a question for Application Express; I don't know why it is in the BPEL forum.
Edited by: TEJU on Oct 24, 2008 2:57 PM
Hello,
You really need to load the data every 5-10 seconds? Presumably it's read only?
I would look at using an Oracle external table instead. That way you just drop your CSV in a location the DB can read, and you build a table that essentially references the underlying CSV file (so when you query the table you see the current data in the CSV).
Take a look at this link for a quick example on usage -
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
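For illustration, a minimal sketch of such an external table — the directory path, file name, and column names here are invented for the example, not taken from the thread:

```sql
-- Hypothetical example: adjust directory, file name, and columns to your data.
CREATE OR REPLACE DIRECTORY csv_dir AS '/u01/app/csv_drop';

CREATE TABLE timesheet_ext (
  employee_id   NUMBER,
  hours_worked  NUMBER,
  work_date     VARCHAR2(10)   -- kept as text; convert with TO_DATE when querying
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                                  -- skip a header row, if present
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('feed.csv')
)
REJECT LIMIT UNLIMITED;

-- Every query reads the file as it currently exists on disk:
SELECT * FROM timesheet_ext;
```

Because the table is only a description of the file, overwriting feed.csv is enough to "refresh" the data; for read-only access no reload job is needed at all.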
Hope this helps,
John.
Blog: http://jes.blogs.shellprompt.net
Work: http://www.apex-evangelists.com
Author of Pro Application Express: http://tinyurl.com/3gu7cd
REWARDS: Please remember to mark helpful or correct posts on the forum, not just for my answers but for everyone!
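If the rows genuinely have to end up in a regular table, a DBMS_SCHEDULER job (Oracle 10g and later) could copy from a CSV-backed external table at a short interval. A sketch under that assumption; every object name below is invented for the example:

```sql
-- Hypothetical polling job: merge the current file contents into a real table
-- every 10 seconds. Assumes an external table TIMESHEET_EXT already exists.
BEGIN
  DBMS_SCHEDULER.create_job(
    job_name        => 'refresh_timesheet',
    job_type        => 'PLSQL_BLOCK',
    job_action      => q'[
      MERGE INTO timesheet t
      USING timesheet_ext e
         ON (t.employee_id = e.employee_id AND t.work_date = e.work_date)
      WHEN MATCHED THEN UPDATE SET t.hours_worked = e.hours_worked
      WHEN NOT MATCHED THEN INSERT (employee_id, work_date, hours_worked)
           VALUES (e.employee_id, e.work_date, e.hours_worked);
      COMMIT;
    ]',
    repeat_interval => 'FREQ=SECONDLY;INTERVAL=10',  -- every 10 seconds
    enabled         => TRUE);
END;
/
```

Polling a flat file every few seconds is unusual, so it is worth confirming the requirement before building this; the external-table-only approach avoids the copy entirely.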
Similar Messages
-
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from a .csv file into an existing/new table, and have also checked out Vikas's application regarding the same. Let me explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is: before the CSV data is loaded into the timesheet table, is there any way of validating the project_key (the primary key of the projects table) against the projects table? I need to perform similar validations on other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility, or is there another method of accomplishing the same?
Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
You can also create hidden items on the page to validate against the parent records before inserting the data.
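One hedged sketch of that idea, checking a foreign key before the insert — the table names, page-item names, and sequence are invented for this example:

```sql
-- Hypothetical validation-before-insert: reject rows whose project_key
-- has no parent in PROJECTS, instead of letting the FK constraint fail.
DECLARE
  v_exists NUMBER;
BEGIN
  SELECT COUNT(*) INTO v_exists
    FROM projects
   WHERE project_key = :P10_PROJECT_KEY;  -- hidden item holding the CSV value

  IF v_exists = 0 THEN
    raise_application_error(-20001,
      'project_key ' || :P10_PROJECT_KEY || ' not found in PROJECTS');
  END IF;

  INSERT INTO timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
  VALUES (timesheet_seq.NEXTVAL, :P10_TIME_WORKED, :P10_DATE, :P10_PROJECT_KEY);
END;
```

The same EXISTS-style check can be repeated for customer_id against the customers table.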
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Error when executing interface which loads data from a csv file with 320 columns
Hi,
Can someone provide a resolution for the below error?
I have created an interface which loads data from a CSV file with 320 columns into a synonym which also has 320 columns,
using LKM File to SQL and IKM SQL Control Append.
I am getting the below error when executing the interface:
com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
BSF info: Create external table at line: 0 column: columnNo
at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
... 11 more
Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
out.print("createTblCmd = r\"\"\"\ncreate table ") ;
out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t", "","")) ;
out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
out.print("'\n\t\t") ;
out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
out.print("\n\t\tBADFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
out.print("\n") ;
if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
} else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
out.print("'\n\t\t") ;
if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
} else {out.print("OPTIONALLY ENCLOSED BY '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
out.print("' AND '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
out.print("' ") ;
}out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
}out.print("\tLOCATION (") ;
out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
out.print(")\n)\n") ;
out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
out.print("\nREJECT LIMIT ") ;
out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
****** ORIGINAL TEXT ******
createTblCmd = r"""
create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t", "","")%>
ORGANIZATION EXTERNAL
TYPE ORACLE_LOADER
DEFAULT DIRECTORY dat_dir
ACCESS PARAMETERS
RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
<%=odiRef.getUserExit("EXT_CHARACTERSET")%>
<%=odiRef.getUserExit("EXT_STRING_SIZE")%>
BADFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
LOGFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
DISCARDFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
SKIP <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
<% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
FIELDS
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%} else {%>
FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
<% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
<%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%}%> LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
<%=odiRef.getUserExit("EXT_PARALLEL")%>
REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
# Create the statement
myStmt = myCon.createStatement()
# Execute the trigger creation
myStmt.execute(createTblCmd)
myStmt.close()
myStmt = None
# Commit, just in case
myCon.commit().
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
The issue is encountered because the text delimiter used in the source file did not consist of a pair of delimiters.
Please see support Note [ID 1469977.1] for details. -
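In other words, the generated LKM code calls substring(0,1) and substring(1,2) on the field-enclosure setting, so it expects two characters: an opening and a closing delimiter. A hedged illustration of the clause the template is trying to emit when the file datastore's text delimiter is set to a pair of double quotes:

```sql
-- Fragment of the generated external-table access parameters once the
-- enclosure is a proper pair (opening and closing delimiter):
FIELDS TERMINATED BY x'2C'            -- comma
OPTIONALLY ENCLOSED BY '"' AND '"'
```

With only a single character configured, substring(1,2) runs past the end of the string, which matches the StringIndexOutOfBoundsException in the trace.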
How to load data from UTL file to database
Hi All,
I am new to these technologies.
I am facing the below problem loading data from a file via UTL_FILE into the database.
Below is the script I have written:
CREATE OR REPLACE PROCEDURE load_data AS
v_line VARCHAR2(2000);
v_file UTL_FILE.FILE_TYPE;
v_dir VARCHAR2(250);
v_filename VARCHAR2(50);
v_1st_Comma NUMBER;
v_2nd_Comma NUMBER;
v_deptno NUMBER;
v_dname VARCHAR2(14);
v_loc VARCHAR2(13);
BEGIN
v_dir := ':f/rashi/dataload';
v_filename := 'fake.txt';
v_file := UTL_FILE.FOPEN(v_dir, v_filename, 'r');
LOOP
BEGIN
UTL_FILE.GET_LINE(v_file, v_line);
EXCEPTION
WHEN no_data_found THEN
exit;
END;
v_1st_Comma := INSTR(v_line, ',' ,1 , 1);
v_2nd_Comma := INSTR(v_line, ',' ,1 , 2);
v_deptno := SUBSTR(v_line, 1, v_1st_Comma-1);
v_dname := SUBSTR(v_line, v_1st_Comma+1, v_2nd_Comma-v_1st_Comma-1);
v_loc := SUBSTR(v_line, v_2nd_Comma+1);
DBMS_OUTPUT.PUT_LINE(v_deptno || ' - ' || v_dname || ' - ' || v_loc);
INSERT INTO don
VALUES (v_deptno, UPPER(v_dname), UPPER(v_loc));
END LOOP;
UTL_FILE.FCLOSE(v_file);
COMMIT;
END;
show error
I am getting the below errors:
LINE/COL ERROR
3/8 PL/SQL: Item ignored
3/8 PLS-00201: identifier 'UTL_FILE' must be declared
15/1 PL/SQL: Statement ignored
15/1 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
20/1 PL/SQL: Statement ignored
20/19 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
36/1 PL/SQL: Statement ignored
LINE/COL ERROR
36/17 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
could anyone please advice me on above error.
thanx.
First of all, is the file located on (or accessible from) the database server? If not, you can't load it with either UTL_FILE or an external table; you need to use a client-side tool like SQL*Loader or write your own client-side load. If the file is accessible from the database server then, as sb92075 noted, an external table (assuming you are not on some ancient Oracle version) is the way to go.
SY. -
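As for the PLS-00201 error itself ('UTL_FILE' must be declared), the usual cause is that the compiling schema has no EXECUTE privilege on SYS.UTL_FILE. A sketch of the typical fix, run as a DBA — the user name and directory path are placeholders, not values from the thread:

```sql
-- Run as SYS/DBA. User name and path are placeholders for this example.
GRANT EXECUTE ON SYS.UTL_FILE TO rashi;

-- UTL_FILE.FOPEN's first argument should name a directory object:
CREATE OR REPLACE DIRECTORY dataload_dir AS '/f/rashi/dataload';
GRANT READ, WRITE ON DIRECTORY dataload_dir TO rashi;

-- The procedure would then open the file as:
--   v_file := UTL_FILE.FOPEN('DATALOAD_DIR', 'fake.txt', 'r');
```

Once the grant is in place, the dependent PLS-00320 errors in the listing should disappear as well, since they all stem from the undeclared package.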
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from a .csv file into an Oracle table.
Is there any mechanism I can follow for this?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post ... already there :)
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
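A minimal SQL*Loader sketch of that suggestion — the file, table, and column names are invented, and the client-side invocation would be something like sqlldr userid=scott/tiger control=load_emp.ctl:

```sql
-- load_emp.ctl: hypothetical control file for a 3-column CSV.
LOAD DATA
INFILE 'emp.csv'
BADFILE 'emp.bad'
APPEND
INTO TABLE emp_stage
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  empno    INTEGER EXTERNAL,
  ename    CHAR,
  hiredate DATE 'YYYY-MM-DD'
)
```

SQL*Loader runs on the client, so it works even when the CSV is not visible to the database server; an external table is the better fit when the file lives on the server.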
Loading Data From .CSV files
Hi Gurus,
I am new to BW.
Can you please help me solve the problem below?
When I was trying to load data from a flat file (.CSV), which looks like this:
Cust no, Cust name, Cust address
cust001,coca cola, Inc., 17 north road, Hyderabad - 35
infoObjects are
ZS_CUST,ZS_CUSTNM,ZS_CUSTAD
I am getting only the first three comma-separated values into my attribute table,
i.e. cust001, coca cola and Inc. go into three fields in the attribute table (and PSA).
Now my point is: without replacing the separator ',' with anything else
(if we have millions of records in the file, it would presumably be time-consuming to change ',' to some other separator), can we extract the complete data into the respective fields of the attribute table?
Thanks
Suri
Surender,
Once you specify in BW "," as the field separator, it will be interpreted as such. Of course, "Coca Cola, Inc" will be interpreted and loaded as two separate fields.
"Data should be fixed in the source, whenever is possible". That's a phrase you'll hear a lot when dealing with BW, especially when loading data from flat files.
Good luck.
Regards,
Luis -
Loading data from CSV to Unix database
Hi All
We copied CSVs onto a UNIX box and tried to upload the data from the CSVs into Oracle. We are able to load data from the CSVs, but while some CSVs load perfectly, others load an extra character into the column.
Even when I put the CSV files on Windows and load the data into the UNIX database, I face the same problem.
But if I use the same CSVs and load the data into a Windows database, it works fine.
Can anybody suggest me the solution.
Regards,
Kumar.
... oh, what a confusion. I still answered in the ITtoolbox group:
"Hi,
the problem is the different character sets in the Windows and the UNIX environment, so some of the characters are misinterpreted.
Is the database where you first loaded the CSVs also on UNIX? How did you copy the files? Via FTP? I hope in ASCII mode?
Regards,
Detlef -
Uploading csv file into database using apex
Dear all
I am using APEX 4 and Oracle Express 10g. I need to upload a .csv file into the database for one of my applications. I have referred to the discussion forum for solutions and found some, but somehow it's not working for me.
The error and the code are mentioned below.
ERROR:
ORA-06550: line 38, column 8: PLS-00221: 'V_DATA_ARRAY' is not a procedure or is undefined ORA-06550: line 38, column 8: PL/SQL: Statement ignored ORA-06550: line 39, column 8: PLS-00221: 'V_DATA_ARRAY' is not a procedure or is undefined ORA-06550: line 39, column 8: PL/SQL: Statement ignored ORA-06550: line 40, column 8: PLS-00221: 'V_DATA_ARRAY' is not a procedure or is undefined ORA-06550: line 40, column 8: PL/SQL: Statement ignored ORA-06550: line 41, column 8: PLS-00221: 'V_DATA_ARRAY' is not a proc
Error
OK
CODE:
DECLARE
v_blob_data BLOB;
v_blob_len NUMBER;
v_position NUMBER;
v_raw_chunk RAW(10000);
v_char CHAR(1);
c_chunk_len number := 1;
v_line VARCHAR2 (32767) := NULL;
v_data_array wwv_flow_global.vc_arr2;
BEGIN
-- Read data from wwv_flow_files
select blob_content into v_blob_data
from wwv_flow_files where filename = 'DDNEW.csv';
v_blob_len := dbms_lob.getlength(v_blob_data);
v_position := 1;
-- Read and convert binary to char
WHILE ( v_position <= v_blob_len ) LOOP
v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
v_line := v_line || v_char;
v_position := v_position + c_chunk_len;
-- When a whole line is retrieved
IF v_char = CHR(10) THEN
-- Convert comma to : to use wwv_flow_utilities
v_line := REPLACE (v_line, ',', ':');
-- Convert each column separated by : into array of data
v_data_array := wwv_flow_utilities.string_to_table (v_line);
-- Insert data into target table
EXECUTE IMMEDIATE 'insert into TABLE_X (v1, v2, v3, v4 ,v5, v6, v7,v8 ,v9, v10, v11)
values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11)'
USING
v_data_array(1),
v_data_array(2),
v_data_array(3),
v_data_array(4);
v_data_array(5);
v_data_array(6);
v_data_array(7);
v_data_array(8);
v_data_array(9);
v_data_array(10);
v_data_array(11);
-- Clear out
v_line := NULL;
END IF;
END LOOP;
END;
What I understand from this is that the system does not identify v_data_array as an array for some reason. Please help me.
Initially the system was giving an error for hex_to_decimal, but I managed to get this function from the discussion forum and now that seems to be OK; the v_data_array problem is still there.
thanks in advance
regards
Uday
Hi,
I corrected the mistakes in your sample.
Problem 1
select blob_content into v_blob_data
from wwv_flow_files where filename = 'DDNEW.csv'; to
select blob_content into v_blob_data
from wwv_flow_files where name = :P1_FILE;
Problem 2
EXECUTE IMMEDIATE 'insert into TABLE_X (v1, v2, v3, v4 ,v5, v6, v7,v8 ,v9, v10, v11)
values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11)'
USING
v_data_array(1),
v_data_array(2),
v_data_array(3),
v_data_array(4);
v_data_array(5);
v_data_array(6);
v_data_array(7);
v_data_array(8);
v_data_array(9);
v_data_array(10);
v_data_array(11); to
EXECUTE IMMEDIATE 'insert into TABLE_X (v1, v2, v3, v4 ,v5, v6, v7,v8 ,v9, v10, v11)
values (:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11)'
USING
v_data_array(1),
v_data_array(2),
v_data_array(3),
v_data_array(4),
v_data_array(5),
v_data_array(6),
v_data_array(7),
v_data_array(8),
v_data_array(9),
v_data_array(10),
v_data_array(11);
And I did create the missing table:
CREATE TABLE TABLE_X (
v1 VARCHAR2(255),
v2 VARCHAR2(255),
v3 VARCHAR2(255),
v4 VARCHAR2(255),
v5 VARCHAR2(255),
v6 VARCHAR2(255),
v7 VARCHAR2(255),
v8 VARCHAR2(255),
v9 VARCHAR2(255),
v10 VARCHAR2(255),
v11 VARCHAR2(255)
);
Regards,
Jari
Edited by: jarola on Nov 19, 2010 3:03 PM -
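For completeness, the hex_to_decimal helper the poster mentions is usually defined along these lines — a common forum version rather than an official Oracle API, so verify it before relying on it:

```sql
-- Hypothetical helper: converts a hex string (as produced by RAWTOHEX)
-- to its decimal value, so CHR() can turn the byte back into a character.
CREATE OR REPLACE FUNCTION hex_to_decimal (p_hex_str IN VARCHAR2)
  RETURN NUMBER
IS
BEGIN
  -- Build an 'X' format mask as long as the input, e.g. '41' -> 'XX' -> 65.
  RETURN TO_NUMBER(p_hex_str, RPAD('X', LENGTH(p_hex_str), 'X'));
END hex_to_decimal;
/
```

With this in place, chr(hex_to_decimal(rawtohex(v_raw_chunk))) in the upload block converts each raw byte back into its character.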
Problem in loading data from flat file to info object
Hi experts,
I am new to BW. I am trying to load data from a .csv file into an InfoObject. The CSV file contains two columns, Mat No and Mat Name, separated by a comma; correspondingly, Mat Name is an attribute of Mat No in the InfoObject. I am following all the steps from the book, but whenever I load data into the InfoObject it all lands in one column: both Mat No and Mat Name appear in the Mat No column, while the Mat Name column stays empty. I have converted the Excel file to .csv. Can anyone help me sort it out?
regards
Hi,
Your flat file should genuinely be in CSV form; it is not enough to separate values with commas inside a Word document.
Create an Excel file, choose Save As, enter your flat-file name, and in "Save as type" select CSV (comma delimited) from the drop-down list;
then enter the data in this file as you would fill in an Excel sheet.
After closing the file, right-click it and open it with WordPad; the data should appear separated by commas.
Try this and then revert back in case of any doubts.
Are you loading data into BI 7.0 or 3.x?
Regards
Madhavi -
How to load the data from .csv file to oracle table???
Hi,
I am using Oracle 10g and PL/SQL Developer. Can anyone help me load the data from a .csv file into an Oracle table? The table is already created with the required columns. The .csv file has about 10 lakh (1,000,000) records. Is it possible to load 10 lakh records? Can anyone please tell me how to proceed?
Thanks in advance.
981145 wrote:
Can you tell me more about SQL*Loader? How do I know whether that utility is available to me or not? I am using an Oracle 10g database and PL/SQL Developer.
SQL*Loader is part of the Oracle client. If you have a developer installation you should normally have it on your client.
The command is
sqlldr
Type it and see if you have it installed.
Have a look also at the FAQ link posted by Marwin.
There are plenty of examples also on the web.
Regards.
Al -
How to create a datasource for 0COSTCENTER to load data in csv file in BI
How do I create a DataSource for 0COSTCENTER to load data from a csv file in a BI 7.0 system?
Can you email me a picture of the steps for loading individual values of the hierarchy using a CSV file?
thank you very much
my email is <Removed>
allen
Step 1: Load the required master data for 0COSTCENTER into the BI system.
Step 2: Enable the characteristic to support hierarchies for 0COSTCENTER, and specify the external characteristic (in the lowest, or last, node) while creating the characteristic InfoObject.
Step 3: On the last node of the hierarchy structure in the InfoObject, right-click and create the hierarchy MANUALLY by inserting the master data values, as BI doesn't support loading this hierarchy directly; you need to do it manually.
Step 4: Mapping
Create Text Node thats the first node (Root Node)
Insert Characteristic Nodes
Insert the Last Node of the Hierarchy
Then you need to create an Open Hub Destination for extracting data into the .csv file:
Step 1: Create the Open Hub Destination, give the master data table name, and enter all the required fields. Create the transformations for this Open Hub connecting to the external file or Excel file source; then, on the first tab, give the location on your local disk or the path on the server, and request the data. It should work; let me know if you need anything else.
Thanks,
Sandhya -
Help Me... How to get Data from CSV File...?
Hi Everyone..!
This is Yajiv, and I am working on the CS3 Photoshop platform. I know JavaScript. Is it possible to get the data from CSV files? Our client sends us CSV files which contain a lot of swatch names and reference files under one particular image name.
How we currently work with such a CSV file is: first we copy the file name and search for it in the CSV file, then we paste the result into the layer name. This process continues till the end of the swatches.
Thank in Advance
-yajiv
> Is it possible to get the data from CSV files.
Have you tried searching this forum? -
Loading data from flat file...
Hello,
I am experiencing a problem where I can't load data from a flat file into a table. I have reverse-engineered the flat file into ODI, but I don't know how to load the reversed data into an RDBMS table. Furthermore, I don't know how to create this RDBMS table from within ODI so that it is reflected in the DB, nor how to load the data from the flat file into the table without having to add the columns manually in order to map between the flat file and the table.
In conclusion, I need to know how to create an RDBMS table from within ODI on the database, and how to automatically map the flat file to the DB table and load the data into it.
Regards,
Hossam
Hi Hossam,
We can use an ODI procedure to create the table in the DB.
Make sure the column names in the table are the same as the column names in the FLAT FILE so that the columns can be mapped automatically.
Regarding loading data from a flat file (i.e. when our source is a FLAT FILE): up to ODI 10.1.3.4 we need to insert the datastore manually, since the file system cannot be reverse-engineered.
Please let me know, Hossam, if I can assist you further.
Thanks and Regards,
Andy -
SQL loaded not loading data from csv format to oracle table.
Hi,
I am trying to load data from flat files into an Oracle table, but it is not loading decimal values. Only character data is loaded; the number columns end up null.
CONTROL FILE:
LOAD DATA
INFILE cost.csv
BADFILE consolidate.bad
DISCARDFILE Sybase_inventory.dis
INSERT
INTO TABLE FIT_UNIX_NT_SERVER_COSTS
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
HOST_NM,
SERVICE_9071_DOLLAR FLOAT,
SERVICE_9310_DOLLAR FLOAT,
SERVICE_9700_DOLLAR FLOAT,
SERVICE_9701_DOLLAR FLOAT,
SERVICE_9710_DOLLAR FLOAT,
SERVICE_9711_DOLLAR FLOAT,
SERVICE_9712_DOLLAR FLOAT,
SERVICE_9713_DOLLAR FLOAT,
SERVICE_9720_DOLLAR FLOAT,
SERVICE_9721_DOLLAR FLOAT,
SERVICE_9730_DOLLAR FLOAT,
SERVICE_9731_DOLLAR FLOAT,
SERVICE_9750_DOLLAR FLOAT,
SERVICE_9751_DOLLAR FLOAT,
GRAND_TOTAL FLOAT
In the table, the FLOAT columns are declared as NUMBER(20,20).
Sample input from the csv:
ABOS12,122.46,,1315.00,,1400.00,,,,,,,,1855.62,,4693.07
Only ABOS12 is loaded and the rest of the numbers are not.
Thanks.
Hi,
Thanks for your reply.
It's throwing errors:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table FIT_UNIX_NT_SERVER_COSTS, loaded from every logical record.
Insert option in effect for this table: INSERT
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
HOST_NM FIRST 255 , O(") CHARACTER
SERVICE_9071_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9310_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9700_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9701_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9710_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9711_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9712_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9713_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9720_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9721_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9730_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9731_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9750_DOLLAR NEXT 255 , O(") CHARACTER
SERVICE_9751_DOLLAR NEXT 255 , O(") CHARACTER
GRAND_TOTAL NEXT 255 , O(") CHARACTER
value used for ROWS parameter changed from 64 to 62
Record 1: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 2: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 3: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 4: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 5: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 6: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 7: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 8: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 9: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 10: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Record 11: Rejected - Error on table FIT_UNIX_NT_SERVER_COSTS, column GRAND_TOTAL.
ORA-01722: invalid number
Please help me on it.
Thanks, -
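Two plausible culprits here, offered as guesses rather than a confirmed diagnosis: NUMBER(20,20) reserves all 20 digits for the fractional part, so a value like 122.46 cannot fit at all, and a Windows-style carriage return at the end of each record would make the last field ("4693.07" plus CR) an invalid number, which matches the ORA-01722 on GRAND_TOTAL. A sketch of both fixes, using one column as an example:

```sql
-- 1) Give the columns integer digits as well as decimals, e.g.:
ALTER TABLE fit_unix_nt_server_costs MODIFY (grand_total NUMBER(20,2));

-- 2) In the control file, strip a possible trailing carriage return
--    from the last field of each record:
--      GRAND_TOTAL "TO_NUMBER(REPLACE(:GRAND_TOTAL, CHR(13)))"
```

If the file was transferred from Windows to UNIX in binary mode, converting the line endings (or transferring in ASCII mode) would address the second issue at the source.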
Load data from XML files into BI
Hi All,
Can any one guide me through the steps which involve in loading data from XML files into BI?
Thanks
Santosh
Hi James,
I have followed up to step 12.
Could you please let me know how to link the XML file on my local drive to BI?
I am not able to figure out where to specify the path of the XML file to be loaded into BI.
Thanks In Advance
Regards,
San
1. Create an InfoSource (ZIS) with a transfer structure matching the structure of the XML file. (You can open the XML file with MS Excel; this way you can see the structure.)
2. Activate the transfer rules.
3. After activation, from the menu bar select Extras > Create BW DataSource with SOAP connection.
4. Now activate the InfoSource again; this creates an XML-based DataSource (6AXXX...).
5. These steps create two sub-nodes under the InfoSource (ZIS).
6. Select the Myself system, create a data package, and run it in init mode (without data transfer).
7. This creates an entry in RSA7 (the BW delta queue).
8. Create another delta InfoPackage under it and run it. Now the DataSource (6AXXXXXX..) turns green in RSA7.
9. In Function Builder (SE37), select your FM (search with F4 on the DataSource 6AXXX...).
10. Inside this RFC-based FM, from the menu bar select Utilities > More Utilities > Create Web Services > From Function Module.
11. A wizard will guide you through the next steps.
12. Once this is done, a web service is enabled in WSADMIN. Select your FM and execute it.
13. From here you can upload the data from the XML file into the BW delta queue.
Edited by: Santosh on Nov 30, 2008 2:22 PM