How to read a CSV file using collections
I am able to retrieve each row of the CSV file, which is in table format. But in some columns the data entered spans two or three lines, and while retrieving the row, the program takes the second line of that column as the next row of the file.
How can I display the entire row of that file on a single line?
I have written the program below; could you please help me with it?
package fileReading;
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;
import java.util.Map;
import java.util.Vector;
public class FileReading {
    public static void main(String[] args) {
        try {
            String line;
            int startInd, endInd;
            char fieldSeparator = ',';
            Map<String, Vector<String>> recordset = new HashMap<String, Vector<String>>();
            Vector<String> record;
            int recordNumber = 0;
            BufferedReader in = new BufferedReader(new FileReader("c:\\GEMSBBtest.csv"));
            line = in.readLine(); // first read skips the header row
            line = in.readLine();
            do {
                int len = line.length();
                record = new Vector<String>();
                for (startInd = -1, endInd = 0; endInd >= 0; ) {
                    endInd = line.indexOf(fieldSeparator, startInd + 1);
                    if (endInd < 0) { // no more separators: take the rest of the line
                        record.add(line.substring(startInd + 1, len));
                    } else {
                        record.add(line.substring(startInd + 1, endInd));
                        startInd = endInd;
                    }
                }
                recordNumber++;
                recordset.put(Integer.toString(recordNumber), record);
                line = in.readLine();
                if (recordNumber < 3) {
                    System.out.println(line);
                }
            } while (line != null);
            in.close();
            System.out.println(recordset.size());
        } catch (Exception e) {
            System.out.println(e.toString());
        }
    }
}
No, there are no such special characters. Actually, the file, which was in XLS format, was converted to CSV format.
Now each cell can contain multiple lines, and I need to print each row on a single line, including the multi-line columns or cells.
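A common fix for exactly this symptom, where a cell containing embedded newlines is treated as a new row, is to join physical lines into one logical record whenever a line ends inside an unclosed double quote. Below is a minimal Java sketch of that idea (class and method names are my own; it assumes the multi-line cells are quoted with double quotes, the way Excel writes them, and joins the pieces with a space so the whole row prints on a single line):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class CsvRecordJoiner {
    // Reads logical CSV records, joining physical lines whose quoted
    // fields contain embedded newlines (detected via an odd quote count).
    public static List<String> readLogicalRecords(BufferedReader in) throws Exception {
        List<String> records = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        String line;
        boolean insideQuotes = false;
        while ((line = in.readLine()) != null) {
            if (current.length() > 0) {
                current.append(' ');   // replace the embedded newline with a space
            }
            current.append(line);
            // toggle quote state for every double quote on the line
            for (int i = 0; i < line.length(); i++) {
                if (line.charAt(i) == '"') insideQuotes = !insideQuotes;
            }
            if (!insideQuotes) {       // record complete: all quotes closed
                records.add(current.toString());
                current.setLength(0);
            }
        }
        return records;
    }

    public static void main(String[] args) throws Exception {
        String csv = "id,comment\n1,\"line one\nline two\"\n2,plain\n";
        BufferedReader in = new BufferedReader(new StringReader(csv));
        for (String rec : readLogicalRecords(in)) {
            System.out.println(rec);
        }
    }
}
```

A loop over readLogicalRecords could then replace the raw in.readLine() calls in the program above, so each logical record is split into fields only after the multi-line pieces have been joined.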
Similar Messages
-
How to read the CSV Files into Database Table
Hi
Friends, I have a table called tblstudent with the following fields: StudentID, StudentName, Class, Father_Name, Mother_Name.
Now I have a CSV file with 1500 records containing all these fields. In my program I need to read all these records into the table tblstudent.
Note: I already have 2000 records in the table tblstudent, and I would like to append all the CSV file records to it.
Please give me some examples of how to do this.
Thank you for your help.
Cheers
Jofin
1) Read the CSV file line by line using BufferedReader.
2) Convert each line (record) to a List and add it to a parent List. If you know the columns beforehand, you might use a List of DTOs.
3) Finally, save the two-dimensional List or the List of DTOs into the database table using plain JDBC or any kind of ORM (Hibernate and so on).
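As a rough illustration of steps 1 and 2 (not the poster's actual code; the Student field names are assumptions based on the table description), step 3 would then bind each DTO to a JDBC PreparedStatement and call executeBatch:

```java
import java.util.ArrayList;
import java.util.List;

public class StudentCsvLoader {
    // Simple DTO for one row of the CSV (field names are assumptions).
    public static class Student {
        public final String id, name, clazz, father, mother;
        Student(String[] f) {
            id = f[0]; name = f[1]; clazz = f[2]; father = f[3]; mother = f[4];
        }
    }

    // Step 2 of the answer: convert CSV lines (header first) into a List of DTOs.
    public static List<Student> parse(List<String> lines) {
        List<Student> out = new ArrayList<>();
        for (int i = 1; i < lines.size(); i++) {      // i = 1 skips the header row
            out.add(new Student(lines.get(i).split(",", -1)));
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> lines = List.of(
            "StudentID,StudentName,Class,Father_Name,Mother_Name",
            "S001,Ravi,10A,Kumar,Lakshmi");
        for (Student s : parse(lines)) {
            System.out.println(s.id + " -> " + s.name);
        }
        // Step 3 would bind each DTO to a PreparedStatement, e.g.
        // "INSERT INTO tblstudent VALUES (?,?,?,?,?)", addBatch(), executeBatch().
    }
}
```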
This article contains some useful code snippets to parse a CSV: http://balusc.xs4all.nl/srv/dev-jep-csv.html -
How to read a .CSV file using UTL_FILE
HI,
How do I read a .csv file line by line using UTL_FILE?
Thanks in advance
Regards,
Gayatri
---- do the open-the-file logic first; let's say this file is delimited by ','
declare
---- declare variables
v_startPos number; -- starting position of field
v_Pos number; -- position of string
v_lenString number; -- length
v_first_field varchar2(30);
v_second_field varchar2(30);
v_third_field varchar2(30);
v_fourth_field varchar2(30);
input_String varchar2(1000); -- buffer for each line of file
---- say you have a 4-column file delimited by ','
delimitChar varchar2(1) := ',';
---- example file contents:
---- Joe,Joan,People,Animal
---- Teddy,Bear,Beans,Toys
begin
loop
begin
utl_file.get_line(input_file, input_String); -- get each line
EXCEPTION
WHEN no_data_found then
fnd_file.put_line(FND_FILE.LOG, 'Last line so exit');
exit;
end;
---- this will get the first field; the last number is the occurrence of the delimiter
v_Pos := instr(input_String,delimitChar,1,1);
v_lenString := v_Pos - 1;
v_first_field := substr(input_String,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will get the second field
v_Pos := instr(input_String,delimitChar,1,2);
v_lenString := v_Pos - v_startPos;
v_second_field := substr(input_String,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- 3rd field
v_Pos := instr(input_String,delimitChar,1,3);
v_lenString := v_Pos - v_startPos;
v_third_field := substr(input_String,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- last field -- there is no delimiter for the last field
v_Pos := length(input_String) + 1;
v_lenString := v_Pos - v_startPos;
v_fourth_field := substr(input_String,v_startPos,v_lenString);
end loop;
end; -
Hi,
I have data in a csv file. I need to read the data from that file, and at the same time I need to omit the first row (i.e., the column header). How can I do this?
Please help me as soon as possible.
Thanks in advance.
BluShadow wrote:
Nuerni wrote:
Just read in the csv file line by line and dismiss the first line; use a tokenizer on each line to get the semicolon-delimited fields of every line; then fill your record/table or anything else with the tokens...
There's a tokenizer example in this forum I've posted recently:
Re: Comma separated values to columns
That's a lot of hard work to achieve something that is built into SQL*Loader and External Tables. All that's needed is the SKIP <n> keyword included in the control definition for those.
You're right!
If SQL*Loader and External Tables is an available option for SHAN01 then try using it; if not, then I would consider the tokenizer method... -
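For the pure-code route the tokenizer suggestion describes, the skip-and-tokenize idea can be sketched in Java like this (class and method names and the sample data are mine; the thread's data is semicolon-delimited, so the split uses ';'):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class SkipHeaderDemo {
    // Reads semicolon-delimited records, dismissing the first (header) line.
    public static List<String[]> readBody(BufferedReader in) throws Exception {
        List<String[]> rows = new ArrayList<>();
        String line = in.readLine();               // read and dismiss the header row
        while ((line = in.readLine()) != null) {
            rows.add(line.split(";", -1));         // tokenize the remaining lines
        }
        return rows;
    }

    public static void main(String[] args) throws Exception {
        BufferedReader in = new BufferedReader(new StringReader("col1;col2\nv1;v2\nv3;v4"));
        for (String[] row : readBody(in)) {
            System.out.println(String.join(" | ", row));
        }
    }
}
```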
Can you please provide the .csv files used in Power BI Getting Started Guide
Hi Team,
I am exploring the Power BI app developed by Microsoft on My Windows 8 machine
For that I found the Getting started guide from the below link
http://office.microsoft.com/en-in/excel-help/power-bi-getting-started-guide-HA104103589.aspx?CTT=5&origin=HA102835634
Power Query and Power Pivot are totally new to me and I am trying to learn more on it
While trying to execute the steps given in the getting started guide, I found that it requires 2 CSV files (NYSE Daily 2009 and the same for NASDAQ) for mashing up data against the S&P 500.
I tried to get them from the web, but unfortunately could not find the required data.
Could you please provide the test data (the 2 CSV files, NYSE Daily 2009 and NASDAQ Daily 2009), using which I can resume my work in Power BI?
Thanks in advance
Rajendra
InfoCepts( Specialists in Onshore and Offshore BI)
Rajendra Kitukale InfoCepts (Specialists in Onsite and Offshore BI)
Hi, all -- here are links to the CSV files.
NYSE sample data:
http://go.microsoft.com/fwlink/?LinkID=389692
NASDAQ sample data:
http://go.microsoft.com/fwlink/?LinkId=389693
Hope that helps!
Maggie Sparkman -
How to Read a CSV file using pure PL/SQL
Hi friends - Is there a possibility to read a .TXT/.CSV file from a local machine and display the contents in the form fields, one record at a time?
The logic to flow through the records can be built, but the problem is where to begin to start reading the file using PL/SQL. I don't want to use any third-party component/API as such. Is there any way in Forms, using PL/SQL, to read an external CSV file and interpret the contents?
I have read about SQL*Loader on some sites but cannot find any way to invoke it on the Windows platform. There seem to be UNIX commands to invoke the .CTL file that is used by SQL*Loader. Any help is much appreciated.
Rgds
Hi, thanks for your replies. TEXT_IO seems to be a solution but not a very comprehensive one, as it provides only a limited set of exposed functions for performing complex operations while reading and writing files.
I think SQL*Loader is a valid solution, but as I stated in my original post, I'm not able to invoke it from the command prompt. The command shown at the suggested URL (http://www.orafaq.com/faqloadr.htm) is not found on my machine. I have Windows 2000 installed. Is there a separate patch or download from which I can get the .EXE to invoke the utility?
Thanks.. -
How to read the excel file using webdynpro abap?
Hi,
how to read and modify an excel file using Web Dynpro ABAP?
Regards,
Pavani
For reading an excel file, follow these steps:
1. Use a FileUpload UI element and bind it with an xstring.
2. Now your excel file will be uploaded and stored in the xstring.
3. Convert the xstring to string data using FM 'HR_KR_XSTRING_TO_STRING'.
4. Now split the string at newlines so as to make an internal table.
Ex. SPLIT l_string AT cl_abap_char_utilities=>newline INTO TABLE it_table.
Here it_table is a table of type string.
5. Now loop at the internal table and split the content of each row at the tab character.
Ex. SPLIT wa_table AT cl_abap_char_utilities=>horizontal_tab INTO TABLE it_new.
it_new is of type string_table.
6. For more info , refer this thread :
Re: How to upload excel file in Webdynpro application using ABAP -
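The two-step split described above (rows at newlines, then cells at tabs) can be sketched outside ABAP as well; here in Java, purely to illustrate the parsing idea (the class name is mine):

```java
public class TabTextParser {
    // Splits spreadsheet-style text: rows on newlines, cells on tabs.
    public static String[][] parse(String raw) {
        String[] lines = raw.split("\n");          // step 4: one entry per row
        String[][] table = new String[lines.length][];
        for (int i = 0; i < lines.length; i++) {
            table[i] = lines[i].split("\t", -1);   // step 5: split each row at tabs
        }
        return table;
    }

    public static void main(String[] args) {
        String[][] t = TabTextParser.parse("a\tb\nc\td");
        System.out.println(t[1][0]); // prints "c"
    }
}
```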
<?xml version="1.0" encoding="utf-8"?>
<mx:Application xmlns:mx="http://www.adobe.com/2006/mxml" layout="absolute" >
<mx:Script>
<![CDATA[
import mx.controls.*;
import mx.collections.*;
import flash.events.*;
import mx.collections.ArrayCollection
[Bindable]
private var records:Array=new Array();
[Bindable]
private var datarecords:ArrayCollection=new ArrayCollection();
private var xmldata:String="<?xml version=\"1.0\"?>\r\n<dataset>\r\n";
private var fileref:FileReference = new FileReference();
private function readExcel():void
{
    var request:URLRequest = new URLRequest();
    request.url = "data/chart.csv";
    var loader:URLLoader = new URLLoader();
    loader.dataFormat = URLLoaderDataFormat.TEXT;
    loader.addEventListener(Event.COMPLETE, eventComplete);
    loader.addEventListener(IOErrorEvent.IO_ERROR, onIOError);
    loader.load(request);
}
private function eventComplete(event:Event):void
{
    var loader:URLLoader = URLLoader(event.target);
    var record:Array = new Array();
    var fields:Array = new Array();
    var obj:Object;
    var result:String = new String(loader.data);
    record = result.split("\r\n");
    for (var i:int = 1; i < record.length; i++)
    {
        obj = new Object();
        fields = record[i].split(",");
        obj.col1 = "<Timestamp>" + fields[0] + "</Timestamp>";
        xmldata += obj.col1 + "\r\n";
        obj.col2 = "<EndDevice-PaLnaMode>" + fields[1] + "</EndDevice-PaLnaMode>";
        xmldata += obj.col2 + "\r\n";
        obj.col3 = "<Wap-PaLnaMode>" + fields[2] + "</Wap-PaLnaMode>";
        xmldata += obj.col3 + "\r\n";
        obj.col4 = "<DownstreamLqi>" + fields[3] + "</DownstreamLqi>";
        xmldata += obj.col4 + "\r\n";
        obj.col5 = "<UpstreamLqi>" + fields[4] + "</UpstreamLqi>";
        records.push(obj);
        datarecords.addItem(obj);
    }
    xmldata += "</dataset>";
    datagrid.dataProvider = records;
    linechart1.dataProvider = records;
}
private function onIOError(event:Event):void
{
    Alert.show("I/O error" + event.type);
}
private function saveXML():void
{
    fileref.save(xmldata, "xmldata.xml");
    //fileref.save(xmldata, "NewFileName.txt");
}
private function dispData():void
{
    fileContents_txt.text = xmldata;
}
]]>
</mx:Script>
<mx:DataGrid id="datagrid" x="19" y="76" width="528" height="242">
<mx:columns>
<mx:DataGridColumn headerText="Timestamp" dataField="col1"/>
<mx:DataGridColumn headerText="DevicePaLnaMode" dataField="col2"/>
<mx:DataGridColumn headerText="Wap-PaLnaMode" dataField="col3"/>
<mx:DataGridColumn headerText="DownstreamLqi" dataField="col4"/>
<mx:DataGridColumn headerText="UpstreamLqi" dataField="col5"/>
</mx:columns>
</mx:DataGrid>
<mx:Button x="126" y="32" label="read" click="readExcel()"/>
<mx:Button x="240" y="32" label="display data" click="dispData();"/>
<mx:Text id="fileContents_txt" x="10" y="326"/>
<!-- Define custom Strokes. -->
<mx:Stroke id = "s1" color="blue" weight="2"/>
<mx:LineChart id="linechart1" height="286" width="385"
paddingLeft="5" paddingRight="5"
showDataTips="true" dataProvider="{datarecords}" x="566" y="32">
<mx:horizontalAxis>
<mx:CategoryAxis categoryField="Timestamp"/>
</mx:horizontalAxis>
<mx:series>
<mx:LineSeries yField="Timestamp" xField="UpstreamLqi" interpolateValues="true" form="curve" displayName="Timestamp" lineStroke="{s1}"/>
</mx:series>
</mx:LineChart>
<mx:Button x="406" y="32" label="SaveXML" click="saveXML()"/>
</mx:Application>
Um, wrong forum.
I think you want to ask this question in the Flex forum. -
Reading csv file using file adapter
Hi,
I am working on SOA 11g. I am reading a csv file using a file adapter. Below are the file contents, and the xsd which gets generated by the Jdev.
.csv file:
empid,empname,empsal
100,Ram,20000
101,Shyam,25000
xsd generated by the Jdev:
<?xml version="1.0" encoding="UTF-8" ?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" xmlns:tns="http://TargetNamespace.com/EmpRead" targetNamespace="http://TargetNamespace.com/EmpRead" elementFormDefault="qualified" attributeFormDefault="unqualified"
nxsd:version="NXSD"
nxsd:stream="chars"
nxsd:encoding="ASCII"
nxsd:hasHeader="true"
nxsd:headerLines="1"
nxsd:headerLinesTerminatedBy="${eol}">
<xsd:element name="Root-Element">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="Child-Element" minOccurs="1" maxOccurs="unbounded">
<xsd:complexType>
<xsd:sequence>
<xsd:element name="empid" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
<xsd:element name="empname" minOccurs="1" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="," nxsd:quotedBy="&quot;" />
<xsd:element name="empsal" type="xsd:string" nxsd:style="terminated" nxsd:terminatedBy="${eol}" nxsd:quotedBy="&quot;" />
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:schema>
For empname I have added minOccurs="1". Now when I remove the empname column, the csv file still gets read from the server without giving any error.
Now, i created the following xml file, and read it through the file adapter:
<?xml version="1.0" encoding="UTF-8" ?>
<Root-Element xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://TargetNamespace.com/EmpRead xsd/EmpXML.xsd" xmlns="http://TargetNamespace.com/EmpRead">
<Child-Element>
<empid>100</empid>
<empname></empname>
<empsal>20000</empsal>
</Child-Element>
<Child-Element>
<empid>101</empid>
<empname>Shyam</empname>
<empsal>25000</empsal>
</Child-Element>
</Root-Element>
When I removed the value of empname, it throws the proper error for the above XML.
Please tell me why the behaviour of the file adapter is different for the csv file and the XML file in the above case.
Thanks -
Read a CSV file and dynamically generate the insert
I have a requirement where there are multiple CSVs which need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, the insert statement, when passed as a parameter to $cmd.CommandText, does not evaluate the values.
How do I evaluate the string in PowerShell?
Import-Csv -Path $FileName.FullName | % {
    # Insert statement.
    $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
    $valCols = ''
    $DataCols = ''
    $lists = $ReqColumns.split(",")
    foreach ($l in $lists) {
        $valCols = $valCols + '$($_.'+$l+')'','''
    }
    # Generate the values statement
    $DataCols = ($DataCols + $valCols + ')').replace(",')", "")
    $insertStr = @("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
    # The above statement generates the following insert statement:
    # INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
    $cmd.CommandText = $insertStr # does not evaluate the values
    # If the same statement is passed as below then it executes successfully:
    # $cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
    # Execute Query
    $cmd.ExecuteNonQuery() | Out-Null
}
jyeragi
Hi Jyeragi,
To convert the data to the SQL table format, please try this function out-sql:
out-sql Powershell function - export pipeline contents to a new SQL Server table
If I have any misunderstanding, please let me know.
If you have any feedback on our support, please click here.
Best Regards,
Anna
TechNet Community Support -
How to read the attribute of the xml file using jaxb
Thanks,
Buddy, I have an issue: I have to read the XML file below using JAXB, and I need to read attributes like name, desc, and action for a particular menu name. Please tell me how to do this; it would be a great favour to me.
Thanks in advance,
Rasool
<contextmenu>
<menu name='Lead' >
<menuitem name='newlead' desc='New Lead' action='/leads.do?dispatch=insert' />
<menuitem name='editlead' desc='Edit Lead' action='' />
<menuitem name='leadinfo' desc='Lead Information' action='' />
</menu>
<menu name='Cases' >
<menuitem name='' desc='' action='' />
<menuitem name='' desc='' action='' />
<menuitem name='' desc='' action='' />
</menu>
<menu name='Contact' >
<menuitem name='' desc='' action='' />
<menuitem name='' desc='' action='' />
<menuitem name='' desc='' action='' />
</menu>
</contextmenu>
What my program does is get the encoding of XML files and convert them to UTF-8 encoded files, so I need this "encoding" information from the original XML document in order to convert...
After reading the specifications and the JDOM docs, the truth turns out to be disappointing: no function is provided to get this information in JDOM level 2 (the current release), though it's promised that this function will be provided in the JDOM level 2 API...
Thanks all for your help and attention!!! -
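Since the reply above drifted off-topic, here is one way to read the name/desc/action attributes for a particular menu. This sketch uses the JDK's built-in DOM parser rather than JAXB (JAXB would require generated or annotated classes, and javax.xml.bind is no longer bundled with recent JDKs); the class and method names are mine:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class MenuReader {
    // Returns the desc attribute of each menuitem under the menu with the given name.
    public static List<String> itemDescs(String xml, String menuName) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        List<String> descs = new ArrayList<>();
        NodeList menus = doc.getElementsByTagName("menu");
        for (int i = 0; i < menus.getLength(); i++) {
            Element menu = (Element) menus.item(i);
            if (!menuName.equals(menu.getAttribute("name"))) continue; // wrong menu
            NodeList items = menu.getElementsByTagName("menuitem");
            for (int j = 0; j < items.getLength(); j++) {
                descs.add(((Element) items.item(j)).getAttribute("desc"));
            }
        }
        return descs;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<contextmenu><menu name='Lead'>"
            + "<menuitem name='newlead' desc='New Lead' action='/leads.do?dispatch=insert'/>"
            + "<menuitem name='editlead' desc='Edit Lead' action=''/>"
            + "</menu></contextmenu>";
        System.out.println(itemDescs(xml, "Lead"));
    }
}
```

The name and action attributes can be read the same way with getAttribute("name") and getAttribute("action").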
Read an avi file using "Read from binary file" vi
My question is how to read an AVI file using the "Read from Binary File" VI.
My objective is to create a series of small AVI files, each 2 seconds long, by using IMAQ AVI Write Frame with the MPEG-4 codec (so 40 frames in each file at 20 fps), and then send them one by one so as to create a stream of video. The images are grabbed from a USB camera. If I read those frames using IMAQ AVI Read Frame then the compression advantage would be lost, so I want to read the whole file itself.
I read the AVI file using "Read from Binary File" with unsigned 8-bit data format, then sent it to the remote end, saved it, and displayed it; however, it didn't work. I later found that if I read an image file using "Read from Binary File" with unsigned 8-bit data format and save it on the local computer itself, the format is changed and the file becomes unrecognizable. Am I doing wrong by reading the file in unsigned 8-bit integer format, or should I have used another data type?
I am using Labview 8.5 and Labview vision development module and vision acquisition module 8.5
Your help would be highly appreciated.
Thank you.
Solved!
Go to Solution.
Attachments:
read avi file in other data format.JPG 26 KB
Hello,
Check out the (full) help message for "write to binary file"
The "prepend array or string size" input defaults to true, so in your example the data written to the file will have array size information added at the beginning and your output file will be (four bytes) longer than your input file. Wire a False constant to "prepend array or string size" to prevent this happening.
Rod.
Message Edited by Rod on 10-14-2008 02:43 PM -
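Rod's point about the extra four bytes can be illustrated outside LabVIEW: a 4-byte length header written before the raw data makes the output exactly four bytes longer than the input. The Java sketch below only demonstrates that framing idea (names are mine; it is not LabVIEW's exact flattened-data format):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;

public class PrependSizeDemo {
    // Mimics "prepend array or string size": a 4-byte big-endian
    // length header written before the raw bytes.
    public static byte[] withSizeHeader(byte[] data) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(data.length); // the extra four bytes
        out.write(data);           // the payload itself, unchanged
        return bos.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        byte[] raw = new byte[] {1, 2, 3};
        System.out.println(withSizeHeader(raw).length - raw.length); // prints 4
    }
}
```

This is why wiring False to "prepend array or string size" makes the written file byte-identical to the source data.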
Data formatting and reading a CSV file without using Sqlloader
I am reading a csv file into an Oracle table called sps_dataload. The table is structured based on the record type at the beginning of
each record in the csv file. But the first two lines of the file are not going to be loaded to the table due to their format.
Question # 1:
How can I skip reading the first two lines of my csv file?
Question # 2:
There are more fields in the csv file than there are columns in my table. I know I can add FILLER as an option, but there are
about 150-odd comma-separated fields in the file and my table has only 8 columns to load from the file. So do I really have to use FILLER
140 times in my script, or is there a better way to do this?
Question # 3:
This is more of an extension of my question above. The csv file has fields enclosed in quotes - I know this could be handled in SQL*Loader with OPTIONALLY ENCLOSED BY '"'.
But is this doable in the insert as created in the code below?
(I am trying to find the "wrap code" button in my post, but do not see it.)
Heres my file layout -
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
This is my table structure -
desc sps_dataload;
File_Name Varchar2 (50) Not Null,
Record_Layer Varchar2 (20) Not Null,
Level_Id Varchar2 (20),
Desc1 Varchar2 (50),
Desc2 Varchar2 (50),
Desc3 Varchar2 (50),
Desc4 Varchar2 (50)
Here's my code to do this -
create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
p_filename IN varchar2,
p_Totalinserted IN OUT number) as
v_filename varchar2(30) := p_filename;
v_filehandle UTL_FILE.FILE_TYPE;
v_startPos number; --starting position of a field
v_Pos number; --position of string
v_lenstring number; --length of string
v_record_layer varchar2(20);
v_level_id varchar2(20) := 0;
v_desc1 varchar2(50);
v_desc2 varchar2(50);
v_desc3 varchar2(50);
v_desc4 varchar2(50);
v_input_buffer varchar2(1200);
v_delChar varchar2(1) := ',';
v_str varchar2(255);
BEGIN
v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
p_Totalinserted := 0;
LOOP
BEGIN
UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
-- this will read the 1st field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,1);
v_lenString := v_Pos - 1;
v_record_layer := substr(v_input_buffer,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 2nd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,2);
v_lenString := v_Pos - v_startPos;
v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 3rd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,3);
v_lenString := v_Pos - v_startPos;
v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 4th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,4);
v_lenString := v_Pos - v_startPos;
v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 5th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,5);
v_lenString := v_Pos - v_startPos;
v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
v_str := 'insert into sps_dataload values ('''||v_filename||''','''||v_record_layer||''','''||v_level_id||''','''||v_desc1||''','''||v_desc2||''','''||v_desc3||''','''||v_desc4||''')'; -- character values must be quoted in the generated SQL
Execute immediate v_str;
p_Totalinserted := p_Totalinserted + 1;
commit;
END LOOP;
UTL_FILE.FCLOSE(v_filehandle);
EXCEPTION
WHEN UTL_FILE.INVALID_OPERATION THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
WHEN UTL_FILE.READ_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
WHEN UTL_FILE.INVALID_PATH THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
WHEN UTL_FILE.INVALID_MODE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
WHEN UTL_FILE.INTERNAL_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
WHEN VALUE_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
WHEN OTHERS THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE;
END insert_spsdataloader;
/
Justin, thanks. I did happen to change my PL/SQL procedure using utl_file.get_line and modifying the instr function based on the position of ',' in the file, but my procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
As I was reading more about external tables as an efficient approach, I came to believe I can build an external table with my varying selection from the file. But I am still unclear whether I can construct my external table by choosing different fields in a record based on a record-identifier string value (which is the first field of any record). I guess I can, but I am looking for the construct: how am I going to use the instr function for selecting the field from the file while creating the table?
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
For example, if I want to create an external table like this -
CREATE TABLE extern_sps_dataload
( record_layer VARCHAR2(20),
attr1 VARCHAR2(20),
attr2 VARCHAR2(20),
attr3 VARCHAR2(20),
attr4 VARCHAR2(20)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY dataload
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE dataload:'sps_dataload.bad'
LOGFILE dataload:'sps_dataload.log'
DISCARDFILE dataload:'sps_dataload.dis'
SKIP 2
VARIABLE 2 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' LRTRIM
MISSING FIELD VALUES ARE NULL
+LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3,FIELD7,FIELD9)+
+LOAD WHEN RECORD_LAYER= 'PRODUCT' (FIELD3,FIELD4,FIELD8,FIELD9)+
+LOAD WHEN RECORD_LAYER= 'SEGMENT' (FIELD1,FIELD2,FIELD4,FIELD5)+ LOCATION ('sps_dataload.csv')
REJECT LIMIT UNLIMITED;
While I was reading the external table documentation, I thought I could achieve similar things by using the position_spec option, but I cannot get behind its parameters. I have highlighted in italics in the code above (from LOAD WHEN ... FIELDS ...) the part I think I am going to use, but I am not sure of its construct.
Thank you for your help!! Appreciate your thoughts on this..
Sanders. -
How to read a .csv file(excel format) using Java.
Hi Everybody,
I need to read a .csv file (excel) and store all the columns and rows in 2D arrays. Then I can do the rest of the coding myself. I would be glad if somebody could post their code to read .csv files here. The .csv file can have a different number of columns and a different number of rows every time it is run. The .csv file is in excel format, so I don't know if that affects the code or not. I would also appreciate it if the imported classes were posted too. I would also like to know if there is a way to recognize how many rows and columns the .csv file has. I need this urgently, so I would be very grateful to anybody who has the solution. Thanks.
Sincerely, Taufiq.
I used this:
BufferedReader in = new BufferedReader(new FileReader("test.csv"));
// and
StringTokenizer parser = new StringTokenizer(str, ",");
while (parser.hasMoreTokens()) {
    // process parser.nextToken() here
}
works like a charm! -
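Building on that, here is a minimal Java sketch for the original question: load the whole .csv into a 2D array and recover the row and column counts from the data itself (the class name is mine; it assumes simple fields with no quoted commas):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class CsvTo2dArray {
    // Reads an entire CSV into a 2D array; row and column counts come from the data.
    public static String[][] read(BufferedReader in) throws Exception {
        List<String[]> rows = new ArrayList<>();
        String line;
        while ((line = in.readLine()) != null) {
            rows.add(line.split(",", -1)); // -1 keeps trailing empty fields
        }
        return rows.toArray(new String[0][]);
    }

    public static void main(String[] args) throws Exception {
        String csv = "a,b,c\n1,2,3";
        String[][] table = read(new BufferedReader(new StringReader(csv)));
        System.out.println(table.length + " rows, " + table[0].length + " columns");
    }
}
```

table.length gives the row count and table[i].length the column count of row i, which answers the "how many rows and columns" part directly.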
How to Compare 2 CSV file and store the result to 3rd csv file using PowerShell script?
I want to do the below task using powershell script only.
I have 2 csv files and I want to compare those two files and store the comparison result in a 3rd csv file. Please look at the following snapshot:
(The snapshot shows the csv file contents.)
Could anyone please help me?
Thanks in advance.
By
A Path finder
JoSwa
If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful"
Best Online Journal
Not certain this is what you're after, but this:
#import the contents of both csv files
$dbexcel=import-csv c:\dbexcel.csv
$liveexcel=import-csv C:\liveexcel.csv
#prepare the output csv and create the headers
$outputexcel="c:\outputexcel.csv"
$outputline="Name,Connection Status,Version,DbExcel,LiveExcel"
$outputline | out-file $outputexcel
#Loop through each record based on the number of records (assuming equal number in both files)
for ($i=0; $i -le $dbexcel.Length-1; $i++) {
    # Assign the yes / null values to equal the word equivalent
    if ($dbexcel.isavail[$i] -eq "yes") {$dbavail="Available"} else {$dbavail="Unavailable"}
    if ($liveexcel.isavail[$i] -eq "yes") {$liveavail="Available"} else {$liveavail="Unavailable"}
    #create the line of csv content from the two input csv files
    $outputline=$dbexcel.name[$i] + "," + $liveexcel.'connection status'[$i] + "," + $dbexcel.version[$i] + "," + $dbavail + "," + $liveavail
    #output that line to the csv file
    $outputline | out-file $outputexcel -Append
}
should do what you're looking for, or give you enough to edit it to your exact need.
I've assumed that the dbexcel.csv and liveexcel.csv files live in the root of c:\ for this, that they include the header information, and that the outputexcel.csv file will be saved to the same place (including headers).