Parse a CSV file every hour
Hi all,
I am working on an app in which I need to parse a CSV file every hour. The CSV file is of average size. I need to parse the file (I will use a simple StringTokenizer), organise the data in it (using simple string manipulation) and export it to some format (I will worry about that later). What is the most efficient and quick way to do this?
And what about the 1-hour loop, how should I implement that? Please help.
Thanks.
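Since the original post mentions StringTokenizer, one pitfall worth knowing up front: StringTokenizer silently swallows empty tokens, which matters for CSV, where two adjacent commas mean an empty column. A minimal sketch of the difference (the sample line is made up for illustration):

```java
import java.util.StringTokenizer;

public class EmptyFieldDemo {
    public static void main(String[] args) {
        String line = "a,,c,"; // second and fourth fields are empty

        // StringTokenizer collapses the empty fields away:
        StringTokenizer st = new StringTokenizer(line, ",");
        System.out.println(st.countTokens());

        // String.split with limit -1 keeps trailing and empty fields:
        String[] fields = line.split(",", -1);
        System.out.println(fields.length);
    }
}
```

This is why most answers further down this thread recommend a real CSV parser over split/tokenize for anything with quoted or empty fields.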
ag2011 wrote:
(original post quoted above, snipped)
Hi,
Take a look at the Quartz API! It has a very efficient job-scheduling engine.
SchedulerFactory schedFact = new org.quartz.impl.StdSchedulerFactory();
Scheduler sched = schedFact.getScheduler();
sched.start();
// create a trigger that fires every hour
Trigger trigger = TriggerUtils.makeHourlyTrigger(1);
trigger.setName("myTrigger"); // Quartz 1.x triggers need a name before scheduling
// CSVParser is the class containing your actual CSV-parsing code
JobDetail jobDetail = new JobDetail("myJob", "MyGrp", CSVParser.class);
// schedule the job
sched.scheduleJob(jobDetail, trigger);
I hope this helps.
See the Quartz API for more details:
http://www.opensymphony.com/quartz/
--------Amit
Edited by: AmitChalwade123456 on Jan 5, 2009 10:57 AM
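If pulling in Quartz feels heavy for a single hourly job, the JDK's own ScheduledExecutorService (java.util.concurrent, since Java 5) also covers this case. A minimal sketch; the task body and the 50 ms demo period are placeholders — in the real app you would pass `1` and `TimeUnit.HOURS`:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class CsvScheduler {
    // Run a task immediately, then repeatedly at a fixed period.
    // For the real app, call this with period = 1 and unit = TimeUnit.HOURS.
    static ScheduledExecutorService schedule(Runnable task, long period, TimeUnit unit) {
        ScheduledExecutorService ses = Executors.newSingleThreadScheduledExecutor();
        ses.scheduleAtFixedRate(task, 0, period, unit);
        return ses;
    }

    public static void main(String[] args) throws Exception {
        CountDownLatch ran = new CountDownLatch(3);
        // Demo only: fire every 50 ms so we can watch it run three times.
        ScheduledExecutorService ses = schedule(ran::countDown, 50, TimeUnit.MILLISECONDS);
        boolean firedThreeTimes = ran.await(5, TimeUnit.SECONDS);
        ses.shutdown();
        System.out.println("fired three times: " + firedThreeTimes);
    }
}
```

The trade-off versus Quartz: no persistence, clustering, or cron expressions, but zero external dependencies.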
Similar Messages
-
Parsing BLOB (CSV file with special characters) into table
Hello everyone,
In my application, the user uploads a CSV file (it is stored as a BLOB), which is later read and parsed into a table. The parsing engine is shown below.
The problem is that it won't read national characters such as Ö, Ü etc.; they simply disappear.
Is there any CSV parser that supports national characters? Or, said in other words: is it possible to read a BLOB character by character (where the characters can be Ö, Ü etc.)?
Regards,
Adam
/*-----------------------------------------------+
 | helper function for csv parsing
 +-----------------------------------------------*/
FUNCTION hex_to_decimal(p_hex_str in varchar2) return number
--this function is based on one by Connor McDonald
--http://www.jlcomp.demon.co.uk/faq/base_convert.html
is
v_dec number;
v_hex varchar2(16) := '0123456789ABCDEF';
begin
v_dec := 0;
for indx in 1 .. length(p_hex_str) loop
v_dec := v_dec * 16 + instr(v_hex, upper(substr(p_hex_str, indx, 1))) - 1;
end loop;
return v_dec;
end hex_to_decimal;
/*-----------------------------------------------+
 | csv parsing
 +-----------------------------------------------*/
FUNCTION parse_csv_to_imp_table(in_import_id in number) RETURN boolean IS
PRAGMA autonomous_transaction;
v_blob_data BLOB;
n_blob_len NUMBER;
v_entity_name VARCHAR2(100);
n_skip_rows INTEGER;
n_columns INTEGER;
n_col INTEGER := 0;
n_position NUMBER;
v_raw_chunk RAW(10000);
v_char CHAR(1);
c_chunk_len number := 1;
v_line VARCHAR2(32767) := NULL;
n_rows number := 0;
n_temp number;
BEGIN
-- shortened
n_blob_len := dbms_lob.getlength(v_blob_data);
n_position := 1;
-- Read and convert binary to char
WHILE (n_position <= n_blob_len) LOOP
v_raw_chunk := dbms_lob.substr(v_blob_data, c_chunk_len, n_position);
v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
n_temp := ascii(v_char);
n_position := n_position + c_chunk_len;
-- When a whole line is retrieved
IF v_char = CHR(10) THEN
n_rows := n_rows + 1;
if n_rows > n_skip_rows then
-- Shortened
-- Perform some action with the line (store into table etc.)
end if;
-- Clear out
v_line := NULL;
n_col := 0;
ELSIF v_char != chr(10) and v_char != chr(13) THEN
v_line := v_line || v_char;
if v_char = ';' then
n_col := n_col+1;
end if;
END IF;
END LOOP;
COMMIT;
return true;
EXCEPTION
-- some exception handling
END;
Uploading CSV files into LOB columns and then reading them in PL/SQL: it's been covered before (http://forums.oracle.com/forums/thread.jspa?messageID=3454184) — see Re: Reading a Blob (CSV file) and displaying the contents, Re: Associative Array and Blob, and Number of rows in a clob, doncha know.
Anyway, it would help if you gave us some basic information: database version and NLS settings would seem particularly relevant here.
Cheers, APC
blog: http://radiofreetooting.blogspot.com -
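The disappearing Ö and Ü are the classic symptom of decoding a byte stream one byte at a time: multi-byte characters get split across reads. On the database side, dbms_lob.converttoclob with the file's character set is the usual fix; the same pitfall can be sketched in Java terms (this is an illustration, not the poster's code — the sample string is made up):

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        byte[] utf8 = "Öl, Übung".getBytes(StandardCharsets.UTF_8);

        // Wrong: treating each byte as one character splits Ö into two junk chars.
        StringBuilder broken = new StringBuilder();
        for (byte b : utf8) broken.append((char) (b & 0xFF));

        // Right: decode the whole buffer with the correct charset.
        String ok = new String(utf8, StandardCharsets.UTF_8);

        System.out.println(ok.length());      // 9 real characters
        System.out.println(broken.length());  // 11 "characters": multi-byte chars were split
        System.out.println(ok.equals(broken.toString()));
    }
}
```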
Parsing a csv file with carriage return replaced with #
Hi,
We have a weird problem. We are able to download a CSV file using the standard FM HTTP_GET. We want to parse the file and upload the data into our SAP CRM system. However, in the downloaded file the carriage returns are replaced by the character '#', and everything appears as one line.
I understand that the system displays the carriage return as the character '#'. My question is: if I pass this file to my program to parse the data, will there be any issue with the system recognizing that the '#' is a carriage return and that the data in the file is not one record but multiple records?
Hi,
'#' is what you see in SAP, but the actual ASCII value is still that of the carriage return. So to check whether you have multiple records, don't use a hard-coded '#'; use the constant CL_ABAP_CHAR_UTILITIES=>CR_LF instead.
Regards
Ranganath -
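The same record-boundary idea, sketched in Java terms (the sample payload is made up): split on the actual CR+LF pair rather than on whatever single character the display happens to show:

```java
public class CrlfSplit {
    public static void main(String[] args) {
        // What looks like "rec1#rec2#rec3" in the SAP GUI is really CR+LF bytes.
        String payload = "rec1\r\nrec2\r\nrec3";
        String[] records = payload.split("\r\n");
        System.out.println(records.length);
    }
}
```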
Reading/Parsing a CSV file in UTF-16 ?
Hello everyone,
I'm in a rush to modify my current CSV file parser, which works fine for files in UTF-8, so that it can parse UTF-16 as well. As far as I checked the sample plugins, I didn't find any relevant code.
Also, how could I support both encodings? To do this I need to recognize the encoding by reading the file first and then decide how to read from the stream. Any advice/snippet will be greatly appreciated.
P.S. I'm using this code to read a file
stream = StreamUtils::CreateFileStreamRead()
stream->XferByte(aChar) // in a loop until an EOL char is found
I need to read 2 bytes at a time; I experimented with XferInt16 but it doesn't seem to do what I want...
Regards,
Kamran
I had forgotten to skip the first two bytes in this case; now I can read the file properly with XferInt16. You may also want to consider byte swapping for big-endian files in the parsing process.
-Kamran -
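A common way to support both encodings is to sniff the byte-order mark before choosing a decoder: FE FF means UTF-16BE, FF FE means UTF-16LE, anything else can fall back to UTF-8. A sketch in Java terms (the plugin API in this thread is different, but the BOM logic carries over — and remember to skip the two BOM bytes before parsing):

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class BomSniffer {
    // Return the charset indicated by a byte-order mark, defaulting to UTF-8.
    static Charset detect(byte[] head) {
        if (head.length >= 2 && (head[0] & 0xFF) == 0xFE && (head[1] & 0xFF) == 0xFF)
            return StandardCharsets.UTF_16BE;
        if (head.length >= 2 && (head[0] & 0xFF) == 0xFF && (head[1] & 0xFF) == 0xFE)
            return StandardCharsets.UTF_16LE;
        return StandardCharsets.UTF_8;
    }

    public static void main(String[] args) {
        // UTF-16LE BOM followed by 'A'
        System.out.println(detect(new byte[]{(byte) 0xFF, (byte) 0xFE, 0x41, 0x00}));
        // No BOM: plain ASCII/UTF-8
        System.out.println(detect(new byte[]{0x41, 0x42}));
    }
}
```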
Parsing "," in CSV file using select command.
How do I handle a comma in an SQL SELECT query? Suppose I have a huge chunk of text that contains various special characters like commas, double quotes, brackets, etc. When I export this to CSV, the whole thing gets
scrambled and displayed across multiple lines and columns.
I tried to Google this but couldn't find an answer. Can somebody please help me get around this?
You can text-qualify the value by enclosing the whole contents within double quotes ("") so that commas etc. are treated as part of the data rather than as a column delimiter.
See how you do it in SSIS
http://visakhm.blogspot.in/2014/07/ssis-tips-handling-inconsistent-text.html
http://visakhm.blogspot.in/2014/06/ssis-tips-handling-embedded-text.html
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
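For completeness, here is a minimal quote-aware splitter, a sketch in Java since the thread never shows one (for production, use a real CSV library): a comma only ends a field when we are outside double quotes, and a doubled quote inside a quoted field is a literal quote.

```java
import java.util.ArrayList;
import java.util.List;

public class QuotedCsv {
    // Split one CSV line, honouring double-quoted fields ("" = literal quote).
    static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (c == '"') {
                if (inQuotes && i + 1 < line.length() && line.charAt(i + 1) == '"') {
                    cur.append('"'); i++;      // "" inside quotes = escaped quote
                } else {
                    inQuotes = !inQuotes;      // opening/closing quote
                }
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString()); cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());
        return fields;
    }

    public static void main(String[] args) {
        List<String> f = split("a,\"hello, world\",\"say \"\"hi\"\"\",b");
        System.out.println(f.size());
        System.out.println(f.get(1));
        System.out.println(f.get(2));
    }
}
```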
How to upload data into a database from a csv file in a jsp app?
I can write an HTML form to let users post a CSV file and store it on the web server; then how could I process the file to load the data into a View Object or an Entity Object without manually parsing the CSV files? Any JSP or Java code samples that would do something like "sqlload" in SQL*Plus?
I'm using JDev 3.1.1.2. Thanks.
Navy_Coder wrote:
6 zebras.
But you must marry his eldest daughter as well, for that dowry. -
Hi all,
I'm new to Flex and have been asked to provide a small widget
that will display the contents of a CSV file in a list. Can I use
HTTPService to open a CSV file and then create an ActionScript 3
function to parse it? If so, can anyone point me to a tutorial on
how to parse a CSV file, please?
Thanks in advance.
Hello Sir,
I am new to Flex and want to read a CSV file using the code below, but it does not seem to be working. Can you please help?
var file:File = evt.currentTarget as File;
file = file.resolvePath(file.nativePath);
var fileStream:FileStream = new FileStream();
fileStream.open(file, FileMode.READ);
var fileData:String = fileStream.readUTFBytes(fileStream.bytesAvailable);
var endings:Array = [File.lineEnding, "\n", "\r"];
but for some reason it returns the strange value "ÐÏ ࡱá". Any idea why I don't get the correct data from the file?
Below is the CSV file I am trying to open.
Title
Given_Name
Surname
Gong
Salutation
Position
Organisation
Address_Line_1
Address_Line_2
Address_Line_3
Suburb
State
Postcode
Country
Home_Phone
Fax
Other_Phone
User_Field_1
User_Field_2
User_Field_3
User_Field_4
Mobile_Phone
Second_Address
Second_Address_Line_1
Second_Address_Line_2
Second_Address_Line_3
Second_Suburb
Second_State
Second_Country
Second_Postcode
Langcode
Website
Mr.
Jeff
Alexander
Retention Marketing
Monday, April 13th, 2009
Mr.
Anthony
Demaso
Retention Marketing
Monday, April 13th, 2009
Sally
Swinamer
Yield
Monday, April 13th, 2009
Chris
Torbay
Yield
Monday, April 13th, 2009
Annette
Warring
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Mark
Khoury
Genesis Vizeum
Monday, April 13th, 2009
Mr.
Andy
Thorndyke
Thorsons
Monday, April 13th, 2009
Shannon
Rutherford
Central Reproductions
Monday, April 13th, 2009
Mr.
Rob
Greenwood
Central Reproductions
Monday, April 13th, 2009
Lisa
Marchese
Des Rosiers
Monday, April 13th, 2009
Mr.
Michael
Whitcombe
McMillan LLP
Monday, April 13th, 2009
Thanks,
Gill -
Hi, any suggestions welcome. I have a page that gets a .CSV file of user names from the local system and sends it to a servlet to parse and send to the database. I intend to use a servlet to parse the CSV files and insert the data into the database. Is there any easier method of implementation, or can I use that method?
Thanks in advance.
Databases usually have utilities which read CSV and upload the data into the database. These are faster than JDBC, but using them would mean calling Runtime.exec(). I guess you will have to go for parsing the files and inserting the records in batches.
-
Reading a CSV file: how to get the column names
Hi,
I am trying to read a csv file and then save the data to Oracle.
Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
Connection c = DriverManager.getConnection("jdbc:odbc:;Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=.;Extensions=csv,txn");
Statement stmt = c.createStatement();
ResultSet rs = stmt.executeQuery("select * from filename.csv");
while(rs.next())
System.out.println(rs.getString("Num"));
My csv file looks like this:
"CHAM-23","COMPANY NAME","Test","12",20031213,15,16
Number,Environ,Envel,Date,Time
"1","2",3,"4",5
"6","7",8,"9",9
Now, is there any way, using the above code, to start processing the file from the second row, which holds the column names, and skip the first row? And also, can I get the name of a column from the ResultSet, something like:
if columnName.equals("Number")
Because I may have a csv file that could have more columns:
"CHAM-24","COMPANY NAME","Test","12",20031213,16,76
Number,Environ,Envel,Date,Time,Total,Count
"1","2","3","4","5",3,9
"6","7","8","9",9",,2
So I want to get the column name and then, based on that column, do some other processing.
Once I read the value of each row I want to save the data to an Oracle table. How do I connect to Oracle from my application, as the database is on the server? Any help is really appreciated. Thanks.
The only thing I could think of (and this is a kludge) would be to attempt to parse the first element of each row as a number. If it fails, you do not have a column-name row. You are counting on the fact that you will never have a column name that is a number in the first position.
However, I agree that not always placing the headers in the same location is asking for trouble. If you have control over the file, format it how you want. If someone else has control over the file, find out why they are doing it that way. Maybe there is a "magic" number in the first row telling you where to jump.
Also, I would not use ODBC simply to parse a CSV file. If the file is formatted identically to Microsoft's format (headers in the first row, all subsequent rows with the same number of columns), then it's fine to take a shortcut and not write your own parser. But if the file does not adhere to that format, don't bother using the MS ODBC driver.
- Saish
"My karma ran over your dogma." - Anon -
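If you drop the ODBC driver and read the file yourself, the header position stops being a problem: read the lines, pick out the header row, and build a name-to-index map. A sketch using the sample rows from the post (naive split, so it assumes no embedded commas in the data rows):

```java
import java.util.HashMap;
import java.util.Map;

public class HeaderCsv {
    public static void main(String[] args) {
        String[] lines = {
            "\"CHAM-23\",\"COMPANY NAME\",\"Test\",\"12\",20031213,15,16",
            "Number,Environ,Envel,Date,Time",
            "\"1\",\"2\",3,\"4\",5"
        };
        // Line 0 is a preamble row; line 1 holds the column names.
        String[] header = lines[1].split(",");
        Map<String, Integer> colIndex = new HashMap<>();
        for (int i = 0; i < header.length; i++) colIndex.put(header[i], i);

        // Look up a value by column name, stripping surrounding quotes.
        String[] row = lines[2].split(",");
        String number = row[colIndex.get("Number")].replace("\"", "");
        System.out.println(number);
    }
}
```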
.csv file in an outlook email attachment
Hello,
How can we accomplish following using SOA/BPEL without using Oracle Beehive or any other third paty tools.
1. Emails contain a .csv file with 5 columns.
2. There is a table in Oracle EBS database with 5 columns.
3. We need to have real time integration to import these .csv file from incoming emails into Oracle table from #2 above.
How can SOA/BPEL detach the .csv file from emails, parse .csv fields and import these files in Oracle table in real time manner?
Please advise.
Darsh
An idea:
- a VBA script in Outlook to save the .csv files to some directory
- a composite polling the directory and inserting data from the parsed .csv files
or
do it all in VBA -
Loading a CSV file into a table as in dataworkshop
Data Workshop has a feature to load a CSV file and create a table based on it; similarly, I want to build this in my application.
I have gone through the forum thread http://forums.oracle.com/forums/thread.jspa?threadID=334988&start=60&tstart=0
but could not download all the files (the application, the HTMLDB_TOOLS package and the PAGE_SENTRY function); I couldn't find the PAGE_SENTRY function.
And when I open this link http://apex.oracle.com/pls/apex/f?p=27746
I couldn't run the application. I provided a CSV file and when I clicked SUBMIT, I got the error:
ORA-06550: line 1, column 7: PLS-00201: identifier 'HTMLDB_TOOLS.PARSE_FILE' must be declared
I tried it on the apex.oracle.com host as given in the previous post.
Any help please. Is there any other method to load data into tables (similar to Data Workshop)?
Hi Jari,
I'm using HTMLDB_TOOLS to parse my CSV files. It works well for creating a report, but if I want to create a new table and upload the data it gives the error *"missing right parenthesis"*.
I've been looking through the package body and I think there is some problem in the code:
IF (p_table_name is not null)
THEN
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN NULL;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P149_DEBUG',l_ddl);
execute immediate l_ddl;
l_ddl := 'insert into '||p_table_name||' '||
'select '||v(p_columns_item)||' '||
'from htmldb_collections '||
'where seq_id > 1 and collection_name='''||p_collection_name||'''';
htmldb_util.set_session_state('P149_DEBUG',v('P149_DEBUG')||'/'||l_ddl);
execute immediate l_ddl;
RETURN;
END IF;
It drops the table but does not create the new one.
P149_DEBUG contains: create table EMP_D (Emp ID number(10),Name varchar2(20),Type varchar2(20),Join Date varchar2(20),Location varchar2(20))
and if I comment out the
BEGIN
execute immediate 'drop table '||p_table_name;
EXCEPTION
WHEN OTHERS THEN NULL;
END;
l_ddl := 'create table '||p_table_name||' '||v(p_ddl_item);
htmldb_util.set_session_state('P149_DEBUG',l_ddl);
execute immediate l_ddl;
then it works fine, i.e. if I enter an existing table name it inserts the rows,
but it is unable to create a new table.
Can you please help to fix it?
Regards,
Little Foot -
Hello everyone,
it is my first post here.
I'm trying to import a CSV file into Numbers, but the result is not quite what I was looking for. First of all, to get Numbers to open the file I had to drag and drop it into Numbers; I didn't find any other way to do it. I was looking for an "import" button that would let me give some kind of instructions on how to open the file, but did not find one.
In the CSV file, the values are separated by commas, which is normal, but the decimal part of a value is also indicated by a comma, and when imported it ends up in its own column. So 4,5 might be two columns, the first with the value 4 and the second with the value 5, but 4,5 could also be the single value 4,5. How do we tell Numbers how it should interpret the different values in a CSV file? Is it possible to say that column 4 is in fact the decimal part of the value in column 3?
Hopefully this is clear enough for you to understand.
Thank you for your help.
Hello,
The support of CSV files was described here many times.
If the decimal separator in use on the system is the period, Numbers requires the original CSV format : Comma Separated Values.
If the decimal separator in use on the system is the comma, Numbers requires the alternate CSV format : Semi-colon Separated Values.
Given what you wrote, you are trying to import data from a file using the original CSV format with the decimal comma.
I know that some experts wrote Python filters to treat such case but I have no reference available at this time.
If you send a sample file to my mailbox, I will try to write an AppleScript deciphering it.
But as I often wrote, CSV is really the worst format ever invented. Why aren't you asking the document's author to use Tab Separated Values format ?
Click my blue name to get my address.
Yvan KOENIG (VALLAURIS, France) vendredi 22 avril 2011 11:04:21 -
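For the record, once the file is split on semicolons, a locale-aware number parser handles the decimal comma cleanly. A sketch in Java terms (Numbers itself is not scriptable this way; the French locale stands in for any comma-decimal locale, and the sample line is made up):

```java
import java.text.NumberFormat;
import java.text.ParseException;
import java.util.Locale;

public class DecimalComma {
    public static void main(String[] args) throws ParseException {
        // Semicolon-separated values, comma as the decimal separator.
        String line = "4,5;12,75;3";
        NumberFormat fr = NumberFormat.getInstance(Locale.FRANCE); // comma = decimal point
        for (String field : line.split(";")) {
            System.out.println(fr.parse(field).doubleValue());
        }
    }
}
```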
Can I use only split for parsing a csv file with commas?
No.
It's more complicated than that.
Use a CSV parsing library. Apache Jakarta has one.
Edited by: paulcw on Mar 23, 2008 8:36 PM -
Hi buddies,
I want to read the values of a CSV file into XML. Can anyone suggest how to do so?
Bye,
Reeju
Hallo,
it is really quite easy! Our firm has developed a proprietary technology for a life assurance application which is configurable and adapts to changing requirements in the mapping between the CSV format and the XML. Unfortunately, I cannot release the technology to you, unless you wish to purchase it, or purchase consultancy from us.
I can however give you a couple of hints.
The most important starting position is to identify on paper how you want to map the fields in your csv-file to XML elements. You need to prepare a schema for the XML file, to document its structure. When you have done this, then you can start to implement the necessary solution.
Then you need to parse your CSV-file, to identify the fields in each line. It is often helpful if the first line of the csv-file contains the field names as a check, but it is not 100% necessary. You could, for example, store the fields in a Vector, or with associated field names in a Hashtable. You could build a Hashtable which maps csv-fields to XML-element-names. You then go through the individual fields and build a DOM-Tree, inserting the associated XML-elements in the correct positions. It is then a simple matter to transform your DOM-Tree to an XML file. For speed of implementation, we normally use XALAN with an xslt script, so that the XML looks reasonably formatted in the file.
Good luck
David Singleton
[email protected] -
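The field-names-to-elements mapping David describes can be sketched with the JDK's own DOM classes (the header and row values here are made up for illustration; a real mapping would come from the schema he mentions):

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class CsvToXml {
    public static void main(String[] args) throws Exception {
        // Hypothetical CSV: first line holds field names, second a data row.
        String[] header = "name,age".split(",");
        String[] row = "Alice,30".split(",");

        // Build a DOM tree: one element per CSV field, named after the header.
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
        Element root = doc.createElement("record");
        doc.appendChild(root);
        for (int i = 0; i < header.length; i++) {
            Element e = doc.createElement(header[i]);
            e.setTextContent(row[i]);
            root.appendChild(e);
        }

        // Serialize the tree to a string.
        StringWriter out = new StringWriter();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        t.transform(new DOMSource(doc), new StreamResult(out));
        System.out.println(out);
    }
}
```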
Uploading CSV file to DB table using APEX
HI,
Can anyone guide me how to upload a .csv file to a DB table.
Thanks in advance.
user12550902 wrote:
I got some information from OTN: some PL/SQL code which is used in the process.
I used it with a browse item, selecting a file on the client disk and submitting it; when I press the submit button, it calls a process.
It is complex to parse a CSV file manually, as your code is doing. Using an external table (and having Oracle parse it for you) is a lot simpler.
My guess is that your parsing code fails to deal with all the data formatting cases for that CSV file - and thus the run-time error and failure.
I suggest that the very first thing you consider is modularisation. How can one test the parsing code in your source? It is totally embedded in the code. A very inflexible approach in all aspects, from code design to maintenance, troubleshooting and debugging. It also does not allow the CSV parsing code to be re-used.
It comes down to a very fundamental software engineering principle - modularise your code. Always.
Here's an example of creating a CSV parser - using a custom SQL data type to store the "tokens" from the CSV file, and a function to return it.
SQL> create or replace type TStrings is table of varchar2(4000);
2 /
Type created.
SQL>
SQL> create or replace function tokenise( line varchar2, separatorChars varchar2 default ',', enclosedBy varchar2 default null ) return TStrings is
2 strList TStrings;
3 str varchar2(4000);
4 i integer;
5 l integer;
6 enclose1 integer;
7 enclose2 integer;
8 encloseStr varchar2(4000);
9 replaceStr varchar2(4000);
10
11 procedure AddString( cLine varchar2 ) is
12 begin
13 strList.Extend(1);
14 strList( strList.Count ) := REPLACE( cLine, CHR(0), separatorChars );
15 end;
16
17 begin
18 strList := new TStrings();
19
20 str := line;
21 loop
22 if enclosedBy is not null then
23 -- find the enclosed text, if any
24 enclose1 := INSTR( str, enclosedBy, 1 );
25 enclose2 := INSTR( str, enclosedBy, 2 );
26
27 if (enclose1 > 0) and (enclose2 > 0) and (enclose2 > enclose1) then
28 -- extract the enclosed string
29 encloseStr := SUBSTR( str, enclose1, enclose2-enclose1+1 );
30 -- replace the separator char's with zero char's
31 replaceStr := REPLACE( encloseStr, separatorChars, CHR(0) );
32 -- and remove the enclosed quotes
33 replaceStr := REPLACE( replaceStr, enclosedBy );
34 -- change the enclosed string in the big string to the replacement string
35 str := REPLACE( str, encloseStr, replaceStr );
36 end if;
37 end if;
38
39 l := LENGTH( str );
40 i := INSTR( str, separatorChars );
41
42 if i = 0 then
43 AddString( str );
44 else
45 AddString( SUBSTR( str, 1, i-1 ) );
46 str := SUBSTR( str, i+1 );
47 end if;
48
49 -- if the separator was on the last char of the line, there is
50 -- a trailing null column which we need to add manually
51 if i = l then
52 AddString( null );
53 end if;
54
55 exit when str is NULL;
56 exit when i = 0;
57 end loop;
58
59 return( strList );
60 end;
61 /
Function created.
SQL> show errors
No errors.
SQL>
SQL> col TOKEN format a30
SQL>
SQL> -- default commas as separator, with a trailing null column
SQL> select rownum as TOKEN_ID, column_value as TOKEN from TABLE( tokenise('id,surname,date,1234,') );
TOKEN_ID TOKEN
1 id
2 surname
3 date
4 1234
5
SQL>
SQL> -- default commas as separator, with enclosed text, nulls, etc
SQL> select rownum as TOKEN_ID, column_value as TOKEN from TABLE( tokenise('id,surname,"What do you want, universe?",date,1234,,another column',',','"') );
TOKEN_ID TOKEN
1 id
2 surname
3 What do you want, universe?
4 date
5 1234
6
7 another column
7 rows selected.
SQL>
This approach enables you to write a single and comprehensive parser, test it using SQL and PL/SQL, use it in PL/SQL and SQL, and re-use the code where and when needed.