Structure of csv file
1) What is the structure of a CSV file?
2) How do I read data from a CSV file?
Use the library at http://ostermiller.org/utils/CSV.html
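For the second question, here is a minimal sketch of reading CSV data (shown with Python's standard csv module rather than the linked Java library, just to illustrate the format):

```python
import csv
from io import StringIO

# A CSV file is plain text: one record per line, fields separated by
# commas, and quotes around any field that itself contains a comma.
sample = 'name,age,city\n"Doe, Jane",34,Berlin\n'

reader = csv.reader(StringIO(sample))
rows = list(reader)
print(rows[0])  # header row: ['name', 'age', 'city']
print(rows[1])  # data row: ['Doe, Jane', '34', 'Berlin']
```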
Similar Messages
-
SSRS to export report as "structured" csv file (of sorts) . . .
I'm trying to use SQL Server Reporting Services (SQL Server 2008 R2) to produce a CSV file. Row 1 in the CSV has to be a summary row with 8 columns. The detail rows which follow have 24 columns. The data in the summary row is "static"
except for a date, a count of detail rows, and a total amount due based on the detail rows that follow. Here's an example of what we need it to look like:
HDR,4242,0,1,20150203,25,I,25823.18,,,,,,,,,,,,,,,,
DTL,4242,0,1,20150203,255092,20150129,989,C,Net 0,Due Upon Receipt,12703,Some Super Customer,1001 Grandview Dr,,SomeCity,TX,US,75012,9729990000,,,,
DTL,4242,0,1,20150203,255093,20150129,1360,C,Net 0,Due Upon Receipt,23774,Another Awesome Customer,52 Six Flags Dr,,DeepInTheHeart,TX,US,76006,8174445555,,,,
I've been able to get the report itself to render correctly in Visual Studio or from a browser using several different approaches, but they all fail in one way or another when I try to save it as a CSV (e.g., a header is prepended to each detail row so they end up
side by side, header and detail end up with extra columns, etc.).
I'm clearly a far cry from an SSRS expert, but this seems like it should be easy. Can someone who
is an SSRS expert (or at least knows more than I do) give me a hint or two? Can this even be done?
Thanks in advance for your help!

Hi there -
Thanks for your response. I apologize for the delay. I think the difference between your screenshot and my scenario is that I have a "header" row of sorts, whereas you have no such row. In my scenario, the first row of the CSV file must have a record type of "TRL". This row summarizes certain elements
of the rows that follow, which are of record type "DTL". Specifically, the TRL row provides the total number of DTL rows in its 6th column and the total dollar amount of the DTL rows in its 8th column. See here for an example:
So in the report shown above, there are 177 DTL rows with a total dollar amount of $301,646.20 and it looks to be in the right format/structure.
However, when I export it to CSV, this is what I get:
Hopefully that demonstrates the situation but if not, let me know.
respectfully,
java_dude -
If a CSV file contains multiple structures...
If a CSV file contains multiple structures, then how should I set the values in File Content Conversion?
Please mention any links regarding File Content Conversion.
Thanks in advance,
Ramesh

Hi,
You are using RecordSet. Here are some scenarios.
http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/frameset.htm
/people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
/people/prateek.shah/blog/2005/06/08/introduction-to-idoc-xi-file-scenario-and-complete-walk-through-for-starters - IDoc to File
/people/ravikumar.allampallam/blog/2005/03/14/abap-proxies-in-xiclient-proxy - ABAP Proxy to File
/people/sap.user72/blog/2005/06/01/file-to-jdbc-adapter-using-sap-xi-30 - File to JDBC
/people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy - File to ABAP Proxy
/people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1 - File to File Part 1
/people/arpit.seth/blog/2005/06/27/rfc-scenario-using-bpm--starter-kit - File to RFC
https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1685 [original link is broken] - File to Mail
Regards,
Wojciech -
Exact file-structure of .csv to import without problems into AddressBook?
Hello,
I tried to import a .csv file into my OS X Address Book. Everything seemed to be all right, but after clicking OK for the import nothing happened. I had to cancel. I think the structure of my .csv file is not compatible with Address Book. If I use AddressBookImporter everything works, but I guess it's easier to import the data directly, so my question is whether anybody here knows more about the structure of the .csv file Address Book expects?
thank you.
Andre

field1,field2,field3,return
field1,field2,field3,return
As long as the fields are in the same sequence, it doesn't matter what sequence it is. When AB tries the import it will give you a dialog box allowing you to associate each field in your file with the appropriate field (last, first, email etc) in AB.
If this isn't appearing then you should get an error message saying the file is not csv.
Make sure when you click Import that you select the text file option.
You could try pasting this into text edit and saving it - I have just tried it here and it worked.
Joe,Smith,[email protected]
Pete,Murphy,[email protected]
AK -
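If hand-editing the file keeps failing, generating it with a script guarantees well-formed CSV. A minimal sketch in Python using the two sample rows above (the addresses are the redacted placeholders from the post):

```python
import csv
from io import StringIO

buf = StringIO()
writer = csv.writer(buf)  # quotes fields automatically if one ever contains a comma
writer.writerow(["Joe", "Smith", "[email protected]"])
writer.writerow(["Pete", "Murphy", "[email protected]"])
print(buf.getvalue())
```

Saving `buf.getvalue()` to a .csv file should give Address Book the text-file structure it expects.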
Modifying the structure of the std CSV file in SNC FTR
Hi All,
We are using the File for SNI Monitor in SNC 7.0 Ehp1 to facilitate the suppliers in uploading their Inventory data (Unrestricted Stock on Hand) for their location products.
The issue with the std. CSV file downloaded from the system is that it has all the key figures from the SNI Monitor which we are not interested in. We want the supplier to just see the following key figures for their location products:
1. Unrestricted-Use Stock- New (its an editable field which the supplier updates in Period0)
2. Unrestricted-Use Stock- Original (which contains the current inventory data in the SNC system for the location product)
Now our concerns are:
a. We are not able to apply a filter to the CSV file for the above key figures, because a blank line between location products prevents filtering. This makes it very cumbersome for the supplier to update the stock data for 200+ SKUs using the std. file.
b. We tried formatting the std. file but ended up with errors on upload.
Can anyone share their inputs on how to simplify the structure of the std. CSV file?
Note: We are looking for std. solutions.
Regards,
Bharath

Hi Mayari,
though you were successful with
METHOD cl_alv_table_create=>create_dynamic_table
I must warn you not to use it. The reason is that the number of tables created is limited: the method uses the GENERATE SUBROUTINE POOL statement, and this triggers an unwanted database commit.
If you know the DDIC structure, it is (starting with ECC 6.0) much easier:
FIELD-SYMBOLS:
  <table> TYPE STANDARD TABLE.
DATA:
  lr_data TYPE REF TO data.
CREATE DATA lr_data TYPE TABLE OF (<DDIC structure>).
ASSIGN lr_data->* TO <table>.
The split code can be simplified, gaining speed and losing complexity without losing functionality:
FIELD-SYMBOLS: <fs_s> TYPE any.
FIELD-SYMBOLS: <fs_t> TYPE any.
SPLIT lv_rec AT ';' INTO TABLE it_string.
LOOP AT it_string ASSIGNING <fs_s>.
  ASSIGN COMPONENT sy-tabix OF STRUCTURE <fs_itwa3> TO <fs_t>.
  IF sy-subrc = 0.
    <fs_t> = <fs_s>.
  ENDIF.
  AT LAST.
    APPEND <fs_itwa3> TO <ft_itab3>.
  ENDAT.
ENDLOOP.
Though it may work as Keshav.T suggested, there is no need to do it that way.
Regards,
Clemens -
How to extract data from an arbitrary xml file and export in a nice csv file?
Hello,
I'm facing big problems in the use of XML files. I have an
application which generates XML files with clusters containing arrays
and scalars, like in the example pasted below. My task is to
read it and export the data into a human-friendly CSV document.
Since I don't know the actual content of the cluster, I need some kind
of intelligent VI which goes through the XML file looking for arrays
and other data structures in order to export them properly in CSV
format (columns with headers).
Thank you
<Cluster>
<Name></Name>
<NumElts>3</NumElts>
<Array>
<Name></Name>
<Dimsize>6</Dimsize>
<I32>
<Name></Name>
<Val>0</Val>
</I32>
<I32>
<Name></Name>
<Val>1</Val>
</I32>
<I32>
<Name></Name>
<Val>2</Val>
</I32>
<I32>
<Name></Name>
<Val>3</Val>
</I32>
<I32>
<Name></Name>
<Val>4</Val>
</I32>
<I32>
<Name></Name>
<Val>5</Val>
</I32>
</Array>
<DBL>
<Name></Name>
<Val>3.14159265358979</Val>
</DBL>
<String>
<Name></Name>
<Val>ciao</Val>
</String>
</Cluster>
Solved!
Go to Solution.

Thank you again,
I'm forwarding my VI draft with many comments and an XML file sample.
Data in the cluster is stored according to the LabVIEW schema; in fact, it is generated by LabVIEW.
What I'm trying to do is to access the elements of the cluster and read their content using the Invoke Node and Property Node functions. Could you give it a look? There may be something wrong; I'm not able to access the cluster children.
Which functions should I use? Could you give me an example? You may use the draft I enclosed...
Then writing these data to a CSV file should be the easier part.
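For what it's worth, the traversal itself is straightforward once the LabVIEW-flattened XML is treated as ordinary XML. This is not a VI, just the same logic sketched in Python against an abbreviated copy of the cluster shown earlier (the tag names I32/DBL/String/Array come from the sample; the row layout is an assumption):

```python
import csv
import xml.etree.ElementTree as ET
from io import StringIO

# Abbreviated version of the LabVIEW-flattened cluster XML shown above
xml_data = """<Cluster><Name/><NumElts>3</NumElts>
<Array><Name/><Dimsize>2</Dimsize>
<I32><Name/><Val>0</Val></I32>
<I32><Name/><Val>1</Val></I32>
</Array>
<DBL><Name/><Val>3.14159265358979</Val></DBL>
<String><Name/><Val>ciao</Val></String>
</Cluster>"""

root = ET.fromstring(xml_data)
rows = []
for child in root:
    if child.tag == "Array":
        # every non-metadata child of an Array contributes one value
        vals = [el.findtext("Val") for el in child
                if el.tag not in ("Name", "Dimsize")]
        rows.append(["Array"] + vals)
    elif child.tag in ("I32", "DBL", "String"):
        rows.append([child.tag, child.findtext("Val")])

buf = StringIO()
csv.writer(buf).writerows(rows)
print(buf.getvalue())
```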
<?xml version="1.0" encoding="iso-8859-1" ?>
<Contents type="Data" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="XMLSection.xsd">
<section name="beta" date="7/31/2009" time="3:43:03 PM" version="1.0">
<Cluster>
<Name />
<NumElts>1</NumElts>
<Array>
<Name />
<Dimsize>4</Dimsize>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.93317638164326</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.79233924020314</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.39199947274518</Val>
</DBL>
<DBL>
<Name>number: 0 to 1</Name>
<Val>0.74817197429441</Val>
</DBL>
</Array>
</Cluster>
</section>
</Contents>
Attachments:
read_array.vi 12 KB -
How can I email using UTL_SMTP with a csv file as an attachment?
Dear All,
It would be great if someone could help me. I am trying to use UTL_SMTP to send an email with a CSV file as an attachment. I do get an email with a message, but no attachment arrives with it.
In fact, the code used for attaching the CSV file gets appended to the message body of the email.
CREATE OR REPLACE PROCEDURE test_mail
AS
SENDER constant VARCHAR2(80) := '[email protected]';
MAILHOST constant VARCHAR2(80) := 'mailhost.xxxx.ac.uk';
mail_conn utl_smtp.connection;
lv_rcpt VARCHAR2(80);
lv_mesg VARCHAR2(9900);
lv_subject VARCHAR2(80) := 'First Test Mail';
lv_brk VARCHAR2(2) := CHR(13)||CHR(10);
BEGIN
mail_conn := utl_smtp.open_connection(mailhost, 25) ;
utl_smtp.helo(mail_conn, MAILHOST) ;
dbms_output.put_line('Sending Email to : ' ||lv_brk||'Suhas Mitra' ) ;
lv_mesg := 'Date: '||TO_CHAR(sysdate,'dd Mon yy hh24:mi:ss')||lv_brk||
'From: <'||SENDER||'>'||lv_brk||
'Subject: '||lv_subject||lv_brk||
'To: '||'[email protected]'||lv_brk||
'MIME-Version: 1.0'||lv_brk||
'Content-type:text/html;charset=iso-8859-1'||lv_brk||
' boundary="-----SECBOUND"'||
''||lv_brk||
'-------SECBOUND'||
'Some Message'
|| lv_brk ||
'-------SECBOUND'||
'Content-Type: text/plain;'|| lv_brk ||
' name="xxxx.csv"'|| lv_brk ||
'Content-Transfer_Encoding: 8bit'|| lv_brk ||
'Content-Disposition: attachment;'|| lv_brk ||
' filename="xxxx.csv"'|| lv_brk ||
lv_brk ||
'CSV,file,attachement'|| lv_brk || -- Content of attachment
lv_brk||
'-------SECBOUND' ;
dbms_output.put_line('lv_mesg : ' || lv_mesg) ;
utl_smtp.mail(mail_conn, SENDER) ;
lv_rcpt := '[email protected]';
utl_smtp.rcpt(mail_conn, lv_rcpt) ;
utl_smtp.data(mail_conn, lv_mesg) ;
utl_smtp.quit(mail_conn);
EXCEPTION
WHEN utl_smtp.transient_error OR utl_smtp.permanent_error THEN
NULL ;
WHEN OTHERS THEN
dbms_output.put_line('Error Code : ' || SQLCODE) ;
dbms_output.put_line('Error Message : ' || SQLERRM) ;
utl_smtp.quit(mail_conn) ;
END;

LKBrwn_DBA wrote:
Use UTL_MAIL instead.

That package is an utter disappointment - and an excellent example, IMO, of how not to design an application programming interface. Even the source code is shoddy... I mean, having to resort to a GOTO statement!? The person(s) who wrote that package are sorely lacking in even the most basic programming skills if structured programming is ignored and a spaghetti command is used instead.
No wonder the public interface of that code is equally shabby and thoughtless... The mail demo code posted by Oracle was better written than this "+package+" they now have bundled as the official Mail API.
I dunno.. if I was in product management there would have been hell to pay over pushing cr@p like that to customers. -
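Setting the rant aside, the structural problem in the original post is real: the top-level Content-Type must be multipart/mixed and carry the boundary, and each part (body and attachment) must be introduced by that boundary. A sketch of the intended message shape using Python's email package (addresses are the redacted placeholders from the post; the PL/SQL would need to emit the equivalent header layout):

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Top-level Content-Type is multipart/mixed and declares the boundary;
# each attached part is then delimited by that boundary automatically.
msg = MIMEMultipart("mixed")
msg["From"] = "[email protected]"
msg["To"] = "[email protected]"
msg["Subject"] = "First Test Mail"

msg.attach(MIMEText("Some Message", "plain"))  # the visible body

csv_part = MIMEText("CSV,file,attachment\n", "plain")  # the attachment content
csv_part.add_header("Content-Disposition", "attachment", filename="xxxx.csv")
msg.attach(csv_part)

raw = msg.as_string()
# utl_smtp.data(mail_conn, raw) would then send this string verbatim
```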
Hi, so I have a CSV file in my working directory and don't know how to load it into a graph in VB. I already loaded my chart onto the form; it's called "Chart1" with "Series1".
The CSV has the following structure:
Date,Value1,Value2
27/04/2012,45.1,60.1
26/04/2012,45.2,60.2
...
I can't figure out how to extract this data from the "*.csv" file and put it in a graph where the
Date is the x-axis and the
Value2 is the y-axis.
Any code would be greatly appreciated!

Is the CSV file data in reverse date order? Your example includes only two lines, so the sequence is not clear.
If so, then you want to add the items to the series by working from the end of the file towards the beginning of the file. That would plot the points along the x-axis in order of increasing date. Is that correct?
But then you say that you want to plot from the beginning to a certain date. Do you mean that you want to plot (reversed) from the beginning of the file to a certain nominated ending date? Or do you want to plot the items starting at the end of the file
and working backwards through the file so that the plot starts at a nominated starting date? I have assumed the former.
To adjust the above code for these new requirements you should load the data into an object that you can manipulate to your requirements. Then you need to separate the plotting from the file access. A list is more suitable than an array.
Dim Readings As List(Of Reading) = New List(Of Reading)
Dim StartDate As Date = New Date(2012, 4, 24)
Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
Dim CurrentPoint As Integer = 0
Using MyReader As New Microsoft.VisualBasic.FileIO.TextFieldParser("C:\Users\Public\Readings.txt")
MyReader.TextFieldType = Microsoft.VisualBasic.FileIO.FieldType.Delimited
MyReader.Delimiters = New String() {","}
Dim currentRow As String()
While Not MyReader.EndOfData
Try
currentRow = MyReader.ReadFields()
Dim D As String = currentRow(0)
Try
Dim Y As Double = CDbl(currentRow(2))
Readings.Add(New Reading(D, Y))
Catch ex As InvalidCastException
If CurrentPoint <> 0 Then
MsgBox("Item " & currentRow(2) & " is invalid. Skipping")
End If
End Try
Catch ex As Microsoft.VisualBasic.FileIO.MalformedLineException
If CurrentPoint <> 0 Then
MsgBox("Line " & ex.Message & " is invalid. Skipping")
End If
End Try
End While
End Using
Readings.Reverse()
CurrentPoint = 0
Chart1.Series(0).Points.Clear()
For Each R As Reading In Readings
Dim D As Date = CDate(R.Read_Date)
If D >= StartDate Then
Chart1.Series(0).Points.AddY(R.Read_Data)
Chart1.Series(0).Points(CurrentPoint).AxisLabel = R.Read_Date
CurrentPoint += 1
End If
Next
End Sub
That code uses the following class:
Public Class Reading
Public Read_Date As String
Public Read_data As Double
Public Sub New(ByVal Item1 As String, ByVal item2 As Double)
Read_Date = Item1
Read_data = item2
End Sub
End Class
I know this thread is a couple of years old, but I tried this code and it works in Studio 2013 Community, and it is very close to what I need. I want to make it a line chart and have 4 series - Temp, Temp_SP, RH% and RH%_SP - so I can log some trending
data from an environmental chamber at work. Can someone help me out with this?
Thanks,
Dan -
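For the original date-ordering question, the core of the answer above (read, reverse so dates increase, then plot Value2) can be sketched compactly. A Python sketch under the thread's assumptions (day/month/year dates, Value2 in the third column):

```python
import csv
from datetime import datetime
from io import StringIO

# CSV in reverse date order, as in the thread (day/month/year)
data = "27/04/2012,45.1,60.1\n26/04/2012,45.2,60.2\n"

readings = []
for row in csv.reader(StringIO(data)):
    readings.append((datetime.strptime(row[0], "%d/%m/%Y"), float(row[2])))

readings.reverse()  # oldest first, so the x-axis increases left to right
xs = [d.strftime("%d/%m/%Y") for d, _ in readings]
ys = [v for _, v in readings]
# Chart1.Series(0).Points.AddXY(x, y) would be the per-point VB equivalent
```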
Loading records from .csv file to SAP table via SAP Program
Hi,
I have a .csv file with 132,869 records and I am trying to load it into an SAP table with a customized SAP program.
After executing the program, only 99,999 records are loaded into the table.
Is there some setting that defines how many records can be loaded into a table? Or what else could be the problem?
Please advise.
Thanks!!!

Hi Arun,
A datasource needs an extract structure to fetch data. It is nothing but a temp table to hold data.
First you need to create a table in SE11 with fields matching those coming from the CSV file.
Then you need to write a report program to read your CSV file and populate your table in BW.
Then you can create a datasource on top of this table.
After that, replicate and load the data into the PSA and use it in the upper flow.
Regards,
Jaya Tiwari -
Generating CSV file with column names and data from the MySQL with JAVA
Hi all,
Could someone give a small example of
how to add column names and data to a CSV file from MySQL?
like
example
sequence_no, time_date, col_name, col_name
123, 27-apr-2004, data, data
234, 27-apr-2004, data, data
Please give a small example of this.
Thanks & Regards
Rama Krishna

Hello Rama Krishna,
Check this code:
Example below exports data from MySQL Select query to CSV file.
testtable structure
CREATE TABLE testtable
(id INT NOT NULL AUTO_INCREMENT PRIMARY KEY,
text varchar(45) NOT NULL,
price integer not null);
Application takes path of output file as an argument.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import java.sql.ResultSet;
import java.sql.SQLException;
public class automateExport {
    public static void main(String[] args) {
        DBase db = new DBase();
        Connection conn = db.connect(
            "jdbc:mysql://localhost:3306/test", "root", "caspian");
        if (args.length != 1) {
            System.out.println(
                "Usage: java automateExport [outputfile path]");
            return;
        }
        db.exportData(conn, args[0]);
    }
}

class DBase {
    public DBase() {
    }

    public Connection connect(String db_connect_str,
                              String db_userid, String db_password) {
        Connection conn;
        try {
            Class.forName("com.mysql.jdbc.Driver").newInstance();
            conn = DriverManager.getConnection(db_connect_str,
                db_userid, db_password);
        } catch (Exception e) {
            e.printStackTrace();
            conn = null;
        }
        return conn;
    }

    public void exportData(Connection conn, String filename) {
        Statement stmt;
        String query;
        try {
            stmt = conn.createStatement(ResultSet.TYPE_SCROLL_SENSITIVE,
                ResultSet.CONCUR_UPDATABLE);
            // For comma separated file
            query = "SELECT id,text,price INTO OUTFILE '" + filename
                + "' FIELDS TERMINATED BY ',' FROM testtable t";
            stmt.executeQuery(query);
        } catch (Exception e) {
            e.printStackTrace();
            stmt = null;
        }
    }
}
Greetings,
Praveen Gudapati -
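One caveat with the code above: SELECT ... INTO OUTFILE writes only the data, so it cannot produce the requested header row of column names. Writing the file from the application side can, by taking the column names from the result-set metadata. A sketch of that idea in Python, with an in-memory SQLite database standing in for MySQL (table and values are illustrative):

```python
import csv
import sqlite3
from io import StringIO

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE testtable "
             "(id INTEGER PRIMARY KEY, text TEXT, price INTEGER)")
conn.executemany("INSERT INTO testtable (text, price) VALUES (?, ?)",
                 [("data", 10), ("data", 20)])

cur = conn.execute("SELECT id, text, price FROM testtable")
buf = StringIO()
writer = csv.writer(buf)
writer.writerow([col[0] for col in cur.description])  # header row from metadata
writer.writerows(cur.fetchall())                      # then the data rows
print(buf.getvalue())
```

With MySQL the same pattern works through any driver that exposes `cursor.description`.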
Hi,
I am doing a File (SalesData.csv) -> PI -> IDOC scenario and have created a DT, MT and also made FCC configuration in sender File Channel.
However, when I run the scenario I get an error message in Channel Monitoring.
CSV File:
12,36,45,78,89
2154,789,65,78,99
This CSV file should be converted into an XML by FCC like:
<MT_SalesData>
<Data>12,36,45,78,89
2154,789,65,78,99</Data>
</MT_SalesData>
I used FCC in the channel like:
Document Name: MT_SalesData
Document Namespace: actual namespace of the MT
Recordset Name: MT_SalesData
Recordset Structure: Data,1
Error in Channel Monitoring:
Conversion initialization failed: java.lang.Exception: java.lang.Exception: java.lang.Exception: Error(s) in XML conversion parameters found: Parameter 'Data.fieldFixedLengths' or 'Data.fieldSeparator' is missing Mandatory parameter 'Data.fieldNames': no value found
What more information should I include in the FCC? If required, I can change the DT.
The entire content of the SalesData.csv file should be placed inside the Data tag, as is, without any modification.
Thanks
Pankaj

Hi Pankaj,
Your file content conversion configuration is incomplete.
Please check the link below for the additional parameters you need to specify:
[http://help.sap.com/saphelp_nwpi71/helpdata/en/44/682bcd7f2a6d12e10000000a1553f6/frameset.htm]
Please check if you have maintained following parameters:
a. NameA.fieldFixedLengths or NameA.fieldSeparator
b. NameA.fieldNames
Regards,
Beena. -
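Putting the answer together with the error message, a minimal parameter set for a single Data recordset might look like the following. This parses each CSV line into five fields (one Data element per line, matching the five values per line in the sample); the field names f1..f5 are placeholders. Note that producing the exact single-Data-tag XML shown in the question, with both lines inside one element, is not what a per-line recordset will give you:

```text
Document Name:       MT_SalesData
Document Namespace:  (actual namespace of the MT)
Recordset Structure: Data,*
Data.fieldNames:     f1,f2,f3,f4,f5
Data.fieldSeparator: ,
Data.endSeparator:   'nl'
```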
Getting Issue while uploading CSV file into internal table
Hi,
The CSV file data format is as below:
a b c d e f
2.01E14 29-Sep-08 13:44:19 2.01E14 SELL T+1
The actual value of column A is 201000000000000
and the value of column D is 201000000035690.
I am uploading the above CSV file into an internal table using
the following code:
TYPES: BEGIN OF TY_INTERN.
INCLUDE STRUCTURE KCDE_CELLS.
TYPES: END OF TY_INTERN.
CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'
EXPORTING
I_FILENAME = P_FILE
I_SEPARATOR = ','
TABLES
E_INTERN = T_INTERN
EXCEPTIONS
UPLOAD_CSV = 1
UPLOAD_FILETYPE = 2
OTHERS = 3.
IF SY-SUBRC <> 0.
MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF.
I am getting all columns' data into the internal table.
The problem is with columns A & D: for both I am getting the value 2.01E+14 into the internal table. How can I get the actual values without modifying the CSV file format?
Waiting for your reply...
Thanks & regards,
abhi

Hi Saurabh,
Thanks for your reply.
Even I can't double-click on those columns,
because the program needs to be executed in the background; there can be a lot of CSV files in one folder, with no manual interaction on those CSV files.
regards,
abhi -
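A note on why no upload trick can help here: Excel already rounded the values away when it saved the file, so 2.01E+14 genuinely encodes 201000000000000, and the trailing digits of 201000000035690 are gone. A quick Python check:

```python
from decimal import Decimal

stored = Decimal("2.01E+14")           # what the CSV cell now contains
original = Decimal("201000000035690")  # what column D used to hold

expanded = int(stored)
print(expanded)              # 201000000000000
print(expanded == original)  # False: the last digits are unrecoverable
```

The only robust fix is upstream: save the column as text (or re-export without scientific notation) before the CSV reaches the background job.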
Complex file content conversion in case of CSV file
Hi Friends,
What you see below was generated from Excel, i.e., an Excel file saved as a CSV file. I need to map this file in XI.
The problem is that I have to take only the data and not the header part.
For example, in this particular line:
Employee ID :,,E00315
I need only E00315 and not the header value. Similarly, I have to find some solution to map all the required data.
Please suggest how to use File Content Conversion in such scenarios.
A solution is required on top priority. Points will be immediately rewarded.
Thanks & Regards
K.Ramesh
,,Time Sheet,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,Ref. No: T-PRO-01-011,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
,,,,,,,,,,Page No.: 01,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Employee ID :,,E00315,,,Sale Order No / Line Item :,,,SO123456,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Employee Name :,,K.RAMESH,,,Client Name :,,,NCLIENTELE,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Month / Week :,,Jul-08,,,Project Name :,,,Internal Project,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Sales Team :,,,,,Project Role Start date :,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Location :,,BANGALORE,,,Project Role End date :,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Sl. No.,Date,Day,AA Type, Task Description,,,,,No.of Hours,Remarks,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
1,7/1/2008,Tue,0814-Talent acquitision,task 1,,,,,12.00,rmk 1,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
2,7/2/2008,Wed,0814-Talent acquitision,task 2,,,,,10.00,rmk 2,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
3,7/3/2008,Thu,0814-Talent acquitision,task 3,,,,,12.00,rmk 3,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
4,7/4/2008,Fri,0814-Talent acquitision,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
5,7/5/2008,Sat,0814-Talent acquitision,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
6,7/6/2008,Sun,0950-Holiday,,,,,,,,,,,,,,,,,,,,,,,,, ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
7,7/7/2008,Mon,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
8,7/8/2008,Tue,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,, , , ,,,,,,,,,,,,,,,,,,,,
9,7/9/2008,Wed,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
10,7/10/2008,Thu,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
11,7/11/2008,Fri,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,, , ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
12,7/12/2008,Sat,0950-Holiday,,,,,,,,,,,,,,,,,,,,,,,,,,,, ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
13,7/13/2008,Sun,0950-Holiday,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,, ,,,,,,,,,,,,,,,,,,,,
14,7/14/2008,Mon,0900-Paid leave,,,,,,,,,,,,,,,,,,,,,,,,,, ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
15,7/15/2008,Tue,0900-Paid leave,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
16,7/16/2008,Wed,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
17,7/17/2008,Thu,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
18,7/18/2008,Fri,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,, ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
19,7/19/2008,Sat,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
20,7/20/2008,Sun,0950-Holiday,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
21,7/21/2008,Mon,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
22,7/22/2008,Tue,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
23,7/23/2008,Wed,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
24,7/24/2008,Thu,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
25,7/25/2008,Fri,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
26,7/26/2008,Sat,0814-Talent acquitision,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
27,7/27/2008,Sun,0950-Holiday,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
28,7/28/2008,Mon,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
29,7/29/2008,Tue,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
30,7/30/2008,Wed,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
31,7/31/2008,Thu,0804-Development,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Prepared By:,,,,,Approved By:,Company Project Manager,,,,Client Project Manager,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Name,,K.RAMESH,,,Name,YYY,,,,ZZZ,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Signature,,,,,Signature,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Date ,,31.07.2008,,,Date ,31.07.2008,,,,31.07.2008,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
Hi,
In this CSV file, all I need is the value E00315 after the label "Employee ID". I used the FCC parameters you suggested. Also, in Document Offset I gave the value 3 so that the first three lines are ignored.
1,,,Time Sheet,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
1,,,,,,,,,,,Ref. No: T-PRO-01-011,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
1,,,,,,,,,,,Page No.: 01,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
1,Employee ID :,E00315
But still i am getting the error
Conversion of file content to XML failed at position 0: java.lang.Exception: ERROR converting document line no. 4 according to structure 'tims_header':java.lang.Exception: ERROR in configuration / structure 'tims_header.': More elements in file csv structure than field names specified!
Please help.
Thanks
Ramesh -
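If FCC keeps rejecting the ragged header block, another option is to pass each line through as a single field and extract the value in mapping; the extraction itself is trivial. A sketch in Python using the header line from the post:

```python
import csv
from io import StringIO

line = "Employee ID :,,E00315,,,Sale Order No / Line Item :,,,SO123456"

fields = next(csv.reader(StringIO(line)))
employee_id = fields[2]  # the value sits two commas after the label
print(employee_id)       # E00315
```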
Data formatting and reading a CSV file without using SQL*Loader
I am reading a CSV file into an Oracle table called sps_dataload. The table is structured based on the record type at the beginning of
each record in the CSV file. But the first two lines of the file are not going to be loaded into the table due to their format.
Question # 1:
How can I skip reading the first two lines from my csv file?
Question # 2:
There are more fields in the CSV file than there are columns in my table. I know I can add FILLER as an option, but there are
about 150-odd comma-separated fields in the file and my table has 8 columns to load from the file. So do I really have to use FILLER
140 times in my script, or is there a better way to do this?
Question # 3:
This is more of an extension of my question above. The CSV file has fields enclosed in quotes - I know this could be handled in SQL*Loader with OPTIONALLY ENCLOSED BY '"'.
But can this be done in the insert as written in the code below?
I am trying to find the "wrap code" button in my post, but do not see it.
Heres my file layout -
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
This is my table structure -
desc sps_dataload;
File_Name Varchar2 (50) Not Null,
Record_Layer Varchar2 (20) Not Null,
Level_Id Varchar2 (20),
Desc1 Varchar2 (50),
Desc2 Varchar2 (50),
Desc3 Varchar2 (50),
Desc4 Varchar2 (50)
Heres my code to do this -
create or replace procedure insert_spsdataloader(p_filepath IN varchar2,
p_filename IN varchar2,
p_Totalinserted IN OUT number) as
v_filename varchar2(30) := p_filename;
v_filehandle UTL_FILE.FILE_TYPE;
v_startPos number; --starting position of a field
v_Pos number; --position of string
v_lenstring number; --length of string
v_record_layer varchar2(20);
v_level_id varchar2(20) := 0;
v_desc1 varchar2(50);
v_desc2 varchar2(50);
v_desc3 varchar2(50);
v_desc4 varchar2(50);
v_input_buffer varchar2(1200);
v_delChar varchar2(1) := ',';
v_str varchar2(255);
BEGIN
v_Filehandle :=utl_file.fopen(p_filepath, p_filename, 'r');
p_Totalinserted := 0;
LOOP
BEGIN
UTL_FILE.GET_LINE(v_filehandle,v_input_buffer);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT;
END;
-- this will read the 1st field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,1);
v_lenString := v_Pos - 1;
v_record_layer := substr(v_input_buffer,1,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 2nd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,2);
v_lenString := v_Pos - v_startPos;
v_desc1 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 3rd field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,3);
v_lenString := v_Pos - v_startPos;
v_desc2 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 4th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,4);
v_lenString := v_Pos - v_startPos;
v_desc3 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
-- this will read the 5th field from the file --
v_Pos := instr(v_input_buffer,v_delChar,1,5);
v_lenString := v_Pos - v_startPos;
v_desc4 := substr(v_input_buffer,v_startPos,v_lenString);
v_startPos := v_Pos + 1;
v_str := 'insert into sps_dataload values ('''||v_filename||''','''||v_record_layer||''','''||v_level_id||''','''||v_desc1||''','''||v_desc2||''','''||v_desc3||''','''||v_desc4||''')';
Execute immediate v_str;
p_Totalinserted := p_Totalinserted + 1;
commit;
END LOOP;
UTL_FILE.FCLOSE(v_filehandle);
EXCEPTION
WHEN UTL_FILE.INVALID_OPERATION THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20051, 'sps_dataload: Invalid Operation');
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20052, 'sps_dataload: Invalid File Handle');
WHEN UTL_FILE.READ_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20053, 'sps_dataload: Read Error');
WHEN UTL_FILE.INVALID_PATH THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20054, 'sps_dataload: Invalid Path');
WHEN UTL_FILE.INVALID_MODE THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20055, 'sps_dataload: Invalid Mode');
WHEN UTL_FILE.INTERNAL_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20056, 'sps_dataload: Internal Error');
WHEN VALUE_ERROR THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE_APPLICATION_ERROR(-20057, 'sps_dataload: Value Error');
WHEN OTHERS THEN
UTL_FILE.FCLOSE(v_FileHandle);
RAISE;
END insert_spsdataloader;
/
Justin, thanks. I did change my PL/SQL procedure to use utl_file.get_line and instr based on the position of ',' in the file, but the procedure is getting really big and too complex to debug. So I got motivated to use external tables or SQL*Loader as plan B.
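As an aside, the repeated instr/substr pairs in the loop above could be factored into one small helper function. A sketch of that idea — the name get_field and its error handling are illustrative, not part of the original code:

```sql
-- Sketch: return the n-th delimited field of a line.
create or replace function get_field(p_line  in varchar2,
                                     p_index in pls_integer,
                                     p_delim in varchar2 default ',')
  return varchar2 as
  v_start pls_integer;
  v_end   pls_integer;
begin
  -- start just after the (n-1)-th delimiter; position 1 for the first field
  v_start := case when p_index = 1 then 1
                  else instr(p_line, p_delim, 1, p_index - 1) + 1 end;
  -- position of the n-th delimiter; 0 means the field runs to end of line
  v_end := instr(p_line, p_delim, 1, p_index);
  if v_end = 0 then
    return substr(p_line, v_start);
  end if;
  return substr(p_line, v_start, v_end - v_start);
end get_field;
/
```

With a helper like this, each "read the Nth field" block in the procedure collapses to a single call such as v_desc1 := get_field(v_input_buffer, 2); (it does not handle quoted fields containing the delimiter, which is one reason external tables or SQL*Loader are attractive).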
As I read more about external tables as an efficient approach, I came to believe I could build an external table with my varying selection of fields from the file. But I am still unclear whether I can construct the external table so that different fields are chosen from a record based on a record-identifier string value (which is the first field of every record). I assume I can, but I am looking for the construct: how to select the fields from the file while creating the table.
PROSPACE SCHEMATIC FILE
; Version 2007.7.1
Project,abc xyz Project,,1,,7,1.5,1.5,1,1,0,,0,1,0,0,0,0,3,1,1,0,1,0,0,0,0,2,3,1,0,1,0,0,0,0,3,3,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
Subproject,25580-1005303.pst,,102,192,42,12632256,1,1,102,192,42,1,12632256,0,6,1,0,32896,1,0,0,0,0,,,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,-1,-1,-1,-1,0,0,0,-1,-1,0,1,1,,,,,,1
Segment, , , 0, 102, 0, 0, 0, 0, 0, 0, 0, 0, 0, , , , , , , , , , , 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, -1, 0, , , 1
Product,00093097000459,26007,2X4 MF SF SD SOLR,,28.25,9.5,52.3, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.60,0,0,0,0,00,-1,0
Product,00093097000329,75556,"22""X22"" BZ CM DD 1548",,27,7,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,345.32
Product,00093097000336,75557,"22""X46"" BZ CM XD 48133",,27,7.5,51, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,0
Product,00093097134833,75621,"22""X22"" BZ CM/YT DD 12828",,27,9,27, 8421504,,0,,xyz INC.,SOLAR,,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,,0,0,0,0,1,000000.20,0,0,0,0,0,0,0,,1
For example, if I want to create an external table like this:
CREATE TABLE extern_sps_dataload
( record_layer VARCHAR2(20),
attr1 VARCHAR2(20),
attr2 VARCHAR2(20),
attr3 VARCHAR2(20),
attr4 VARCHAR2(20)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
DEFAULT DIRECTORY dataload
ACCESS PARAMETERS
( RECORDS DELIMITED BY NEWLINE
BADFILE dataload:'sps_dataload.bad'
LOGFILE dataload:'sps_dataload.log'
DISCARDFILE dataload:'sps_dataload.dis'
SKIP 2
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' LRTRIM
MISSING FIELD VALUES ARE NULL
LOAD WHEN RECORD_LAYER = 'PROJECT' (FIELD2, FIELD3, FIELD7, FIELD9)
LOAD WHEN RECORD_LAYER = 'PRODUCT' (FIELD3, FIELD4, FIELD8, FIELD9)
LOAD WHEN RECORD_LAYER = 'SEGMENT' (FIELD1, FIELD2, FIELD4, FIELD5)
)
LOCATION ('sps_dataload.csv')
)
REJECT LIMIT UNLIMITED;
While reading the external table documentation, I thought I could achieve something similar using the position_spec option, but I cannot get behind its parameters. The part I think I am going to use is the LOAD WHEN ... FIELDS ... section above, but I am not sure of its construct.
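For what it's worth, in the ORACLE_LOADER access parameters LOAD WHEN is a single record-filter clause for the whole table, not a per-record-type field selector, so one common pattern is to define one external table per record type. A hedged sketch under that assumption (the table name, field names, and sizes are illustrative; the directory and file names follow the DDL above):

```sql
-- Sketch: one external table per record type, filtered with LOAD WHEN.
CREATE TABLE extern_sps_product
( record_layer VARCHAR2(20),
  attr1        VARCHAR2(50),
  attr2        VARCHAR2(50),
  attr3        VARCHAR2(50)
)
ORGANIZATION EXTERNAL
( TYPE ORACLE_LOADER
  DEFAULT DIRECTORY dataload
  ACCESS PARAMETERS
  ( RECORDS DELIMITED BY NEWLINE
    LOAD WHEN (record_layer = 'Product')   -- keep only Product records
    BADFILE dataload:'sps_product.bad'
    LOGFILE dataload:'sps_product.log'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LRTRIM
    MISSING FIELD VALUES ARE NULL
    ( record_layer, attr1, attr2, attr3 )
  )
  LOCATION ('sps_dataload.csv')
)
REJECT LIMIT UNLIMITED;
```

A similar table with LOAD WHEN (record_layer = 'Project') would pick up the Project records; records that fail the condition go to the discard file rather than the bad file.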
Thank you for your help! I appreciate your thoughts on this.
Sanders. -
How to solve extra commas in CSV files in FCC
Hi,
I have a comma-separated CSV file with around 50 fields. Some description fields contain an embedded comma (e.g. karan,kumar) that should stay part of a single field, so the line is parsed as 51 fields instead of 50.
Because of this, file content conversion fails with the error: "ERROR converting document line no. 8534 according to structure 'Header':java.lang.Exception: ERROR in configuration / structure 'Header.': More elements in file csv structure than field names specified!"
About 10 fields can contain embedded commas this way. Any idea how to solve this issue?
Thanks
Deepthi.
Hello Deepthi,
One option is to use fieldFixedLengths instead of fieldSeparator. That way, even though the file contains more separators than there are field names, no error is triggered. (This will still keep the commas in the XML structure, which you may not want, though you can replace them in mapping; it is one solution.)
You can also try the enclosureSign option, but then the enclosure character must actually surround such fields in the file, which is again troublesome.
eg:- field1, | karan,kumar|,fieldn
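If the file does enclose such fields as in the example above, the sender-channel content conversion parameters might look roughly like this (a sketch only; the structure name Header is taken from the error message in this thread, and the field list is abbreviated):

```text
Header.fieldNames     = field1,field2,field3
Header.fieldSeparator = ,
Header.enclosureSign  = |
Header.endSeparator   = 'nl'
```

With enclosureSign set, the adapter treats commas inside the enclosed field as data rather than separators, so the line parses back to the declared number of fields.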
If you are familiar with adapter modules, you can handle this easily there, but I would treat that as a last option.
Regards,
Prasanna