Combine csv files and insert source filename
Newbie PowerShell question - I need help combining multiple CSV files (100+) and inserting a column containing the name of the source file.
For example:
Apples.csv
A,B,C,D
1,2,3,4
Oranges.csv
A,B,C,D
1,2,3,4
Desired output would look like this:
combined.csv
A,B,C,D,Filename
1,2,3,4,Apples.csv
1,2,3,4,Oranges.csv
I did see a post with something similar to what I wanted to accomplish, but it wouldn't work for me since my file names will change and I am not going to create a variable for each file:
$CSV1 = Import-CSV c:\csv1.csv
$CSV2 = Import-CSV c:\csv2.csv
$Result = New-Object PSObject -Property @{
    Field1  = $CSV1.Field1
    Field2  = $CSV1.Field2
    Source1 = "CSV1"
    Field3  = $CSV2.Field1
    Field4  = $CSV2.Field2
    Source2 = "CSV2"
}
$Result | Export-CSV c:\Result.csv -NoTypeInformation
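Since the file names change, a folder-based approach avoids per-file variables entirely. Below is a minimal sketch, assuming all of the CSVs share the same headers and sit in one folder (the C:\csv and C:\combined.csv paths are placeholders):

# Read every CSV in the folder, add a Filename column holding the source
# file's name, and write all rows to one combined file.
Get-ChildItem -Path C:\csv -Filter *.csv |
    ForEach-Object {
        $name = $_.Name
        Import-Csv -Path $_.FullName |
            Select-Object *, @{ Name = 'Filename'; Expression = { $name } }
    } |
    Export-Csv -Path C:\combined.csv -NoTypeInformation

The calculated property reads whatever $name holds for the file currently being imported, so every row is tagged with its own source file name.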
Similar Messages
-
Hello everyone,
To move data from roughly 50 CSV files in a loop to SQL tables (table per CSV file), I've used BulkInsert task in For each loop container.
Please note, for all the different columns of all the CSV files, the field length was specified as varchar(255).
It worked well for the first 6 files, but on the 7th file it found data in one of the columns of the CSV file which exceeds the limit of 255 characters. I would like to truncate the data for that column; in other words, I would like to insert the first 255 characters into the table for this field and let the package execute further.
Also, if I were to use SqlBulkCopy in a Script task, would that resolve this problem? I believe I would face the same problem there too, although I would like to get confirmation on that from the experts.
Can you please advise how to get rid of this truncation error?
Any help would be greatly appreciated.
Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com
Hello! I suggest you add a Derived Column transformation between the source and the destination, use string functions to extract the first 255 characters from the incoming field, and send that output to the destination field; that way you can get rid of this sort of issue.
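If changing the package is awkward, another option is to trim the offending column before the load. A rough PowerShell sketch of that idea; the file paths and the 'LongColumn' name are placeholders for the real ones:

# Truncate one column to 255 characters before the file is bulk-loaded.
Import-Csv -Path C:\input\file7.csv |
    ForEach-Object {
        if ($_.LongColumn.Length -gt 255) {
            $_.LongColumn = $_.LongColumn.Substring(0, 255)
        }
        $_
    } |
    Export-Csv -Path C:\input\file7_trimmed.csv -NoTypeInformation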
Useful Links:
http://sqlage.blogspot.in/2013/07/ssis-how-to-use-derived-column.html
Good Luck!
Please Mark This As Answer if it solved your issue.
Please Vote This As Helpful if it helps to solve your issue -
SQL server 2014 and VS 2013 - Dataflow task, read CSV file and insert data to SQL table
Hello everyone,
I was assigned a work item wherein I have a dataflow task inside a For Each Loop container at the control flow of an SSIS package. This For Each Loop container reads the CSV files from the specified location one by one and populates a variable with the current file name. Note that the tables where I would like to push the data from each CSV file have the same names as the CSV files.
On the dataflow task I have a Flat File component as a source; this component uses the above variable to read the data of a particular file. Now here is my question: how can I move the data to the destination SQL table using the same variable name?
I've tried to set up the OLE DB destination component dynamically, but it executes well only the first time; it does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns. These files need to be migrated to SQL tables in the optimal way.
Does anybody know which is the best way to setup the Dataflow task for this requirement?
Also, I cannot use Bulk insert task here as we would like to keep a log of corrupted rows.
Any help would be much appreciated. It's very urgent.
Thanks, Ankit Shah | Inkey Solutions, India | Microsoft Certified Business Management Solutions Professional | http://ankit.inkeysolutions.com
The standard Data Flow Task supports only static metadata defined at design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and it supports dynamic metadata at runtime. You can process all your input CSV files using a single Data Flow Task Plus. No programming skills are required.
SSIS Tasks Components Scripts Services | http://www.cozyroc.com/ -
Read csv file and insert the data to a list in sharePoint
Hi everyone,
I wrote code that reads all the data from a CSV file, but I also need to insert all of that data into a new list in SharePoint.
How can I do this?
Plus, I need to read from the CSV file once a day at a specific hour. How can I do that? Thank you so much!!
Did you look at the example I posted above?
ClientContext in CSOM will allow you to get a handle on the SharePoint objects;
http://msdn.microsoft.com/en-us/library/office/ee539976(v=office.14).aspx
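For reference, a rough PowerShell CSOM sketch of that idea; the client DLL paths, site URL, list name, and column-to-field mappings are all placeholders you would swap for your own:

# Read a CSV and create one SharePoint list item per row via CSOM.
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll'
Add-Type -Path 'C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll'

$ctx  = New-Object Microsoft.SharePoint.Client.ClientContext('http://sharepoint/sites/demo')
$list = $ctx.Web.Lists.GetByTitle('MyCsvList')

Import-Csv -Path 'C:\data\input.csv' | ForEach-Object {
    $itemInfo = New-Object Microsoft.SharePoint.Client.ListItemCreationInformation
    $item = $list.AddItem($itemInfo)
    $item['Title']  = $_.Title      # map each CSV column to a list field
    $item['Amount'] = $_.Amount
    $item.Update()
}
$ctx.ExecuteQuery()
$ctx.Dispose()

For the once-a-day requirement, the script can simply be run by a Windows Task Scheduler job set to the desired hour.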
w: http://www.the-north.com/sharepoint | t: @JMcAllisterCH | YouTube: http://www.youtube.com/user/JamieMcAllisterMVP -
Upload csv file and insert into a table for user
Hello,
I want to create a page on which the user of my application can upload a semicolon-delimited file, and the data from the file should automatically be stored in a table.
I can upload the file and the file is then stored in the htmldb_application_files table.
But I do not know how to parse the file...
Can anyone help me or is there anyone who have done that before?
Thank you,
Tim
Tim...
Here is what I did in a similar situation.
Let the user download a csv file to use as an excel turnaround document.
I check digit the primary key. They are not supposed to touch that column.
They do their Excel thing, adding data in columns next to the ones they are updating. They now have the original and new data on the same row in the Excel document. They save it on a share drive as a CSV. A Perl script wakes up and parses the CSV, verifies the check digit, checks that the old values still exist in the table, etc., and then does the update if all is well at the row level. The CSV is replaced showing the success or failure of the update on each row.
There are probably lots of other ways to accomplish this, but I have gotten years of use out of the script. The original CSV can come out of almost any application; mine come from APEX, Discoverer and some Excel queries.
Bob -
How to read a CSV file and Insert data into an Oracle Table
Hi All,
I have a CLOB as an IN parameter in my procedure. The file is comma separated. I need a procedure that would parse this CLOB variable and populate an Oracle table.
Please let me some suggestions on this.
Thanks,
Chandra R
jeneesh wrote:
And, please don't "hijack" 5 year old thread..Better start a new one..I've just split it off to a thread of it's own. ;)
@OP,
I have a Clob file as a in parameter in my PROC. File is comma separated. Need procedure that would parse this CLOB variable and populate in oracle table.
You don't have a "CLOB file", as there's no such thing. CLOB is a datatype for storing large character based objects; a file is something on the operating system's filesystem.
So, why have you stored comma separated data in a CLOB?
Where did this data come from? If it came from a file, why didn't you use SQL*Loader or, even better, External Tables to read and parse the data into structured format when populating the database with it?
If you really do have to parse a CLOB of data to pull out the comma separated values, then you're going to have to write something yourself to do that, reading "lines" by looking for the newline character(s), and then breaking the "lines" up into their component data by looking for commas within them, using normal string functions such as INSTR and SUBSTR or, if necessary, REGEXP_INSTR and REGEXP_SUBSTR. If you have string data that contains commas but uses double quotes around the string, then you'll also have the added complexity of ignoring commas within such string data.
Like I say... it's much easier with SQL*Loader or External Tables, as these are designed to parse such CSV-type data. -
Read a CSV file and dynamically generate the insert
I have a requirement where there are multiple CSVs which need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, when the insert statement is assigned to $cmd.CommandText, the values are not evaluated.
How do I get the string evaluated in PowerShell?
Import-Csv -Path $FileName.FullName | % {
    # Insert statement.
    $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
    $valCols='';
    $DataCols='';
    $lists = $ReqColumns.split(",");
    foreach($l in $lists)
    {
        $valCols= $valCols + '$($_.'+$l+')'','''
    }
    #Generate the values statement
    $DataCols=($DataCols+$valCols+')').replace(",')","");
    $insertStr =@("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
    #The above statement generates the following insert statement
    #INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
    $cmd.CommandText = $insertStr #does not evaluate the values
    #If the same statement is passed as below then it executes successfully
    #$cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
    #Execute Query
    $cmd.ExecuteNonQuery() | Out-Null
}
jyeragi
Hi Jyeragi,
To convert the data to the SQL table format, please try this function out-sql:
out-sql Powershell function - export pipeline contents to a new SQL Server table
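Alternatively, here is a rough sketch of the same loop using a parameterized SqlCommand; the connection string is a placeholder, and the table and column names are taken from the example in the question. Parameters sidestep the string-expansion problem and avoid quoting issues:

# Hypothetical connection string; swap in your own server and database.
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=.;Database=Staging;Integrated Security=True'
$conn.Open()

$cmd = $conn.CreateCommand()
$cmd.CommandText = 'INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID, QTY_SOLD, QTY_AVAILABLE) VALUES (@id, @sold, @avail)'
[void]$cmd.Parameters.Add('@id',    [System.Data.SqlDbType]::VarChar, 50)
[void]$cmd.Parameters.Add('@sold',  [System.Data.SqlDbType]::Int)
[void]$cmd.Parameters.Add('@avail', [System.Data.SqlDbType]::Int)

Import-Csv -Path $FileName.FullName | ForEach-Object {
    # Bind the current row's values to the parameters and execute.
    $cmd.Parameters['@id'].Value    = $_.PRODUCT_ID
    $cmd.Parameters['@sold'].Value  = [int]$_.QTY_SOLD
    $cmd.Parameters['@avail'].Value = [int]$_.QTY_AVAILABLE
    $cmd.ExecuteNonQuery() | Out-Null
}
$conn.Close()

If you would rather keep the string-building approach, $ExecutionContext.InvokeCommand.ExpandString() can expand the $($_.X) tokens in a string at run time.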
If I have any misunderstanding, please let me know.
If you have any feedback on our support, please click here.
Best Regards,
Anna
TechNet Community Support -
2.5 GB CSV file as data source for Crystal report
Hi Experts,
I was asked to create a Crystal report using a CSV file that is pretty huge (2.4 GB) as the data source. Could you point me to any doc that explains the steps, mainly the data connectivity?
Objective is to create Crystal Report using that csv file as data source, save the report as .rpt with the data and send the results to customer to be read with Crystal Reports Viewer or save the results to PDF.
Please help and suggest me steps as I am new to crystal reports and CSV as source.
BR, Nanda Kishore
Nanda,
The issue of having some records with commas and some with semicolons will need to be resolved before you can do an import. Assuming that there are no semicolons in any of the text values of the report, you could do a "Find & Replace" to convert the semicolons to commas.
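If you want to script that replace, a rough PowerShell sketch of the idea (it assumes no quoted field legitimately contains a semicolon, and the file names are placeholders):

# Replace every semicolon with a comma before the bulk load.
(Get-Content -Path 'C:\My_CSV_File.csv') -replace ';', ',' |
    Set-Content -Path 'C:\My_CSV_File_fixed.csv'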
If find & replace isn't an option, you'll need to get the files separately.
I've never used the Import Export Wizard myself. I've always used the BULK INSERT command.
It would look something like this...
BULK INSERT SQLServerTableName
FROM 'c:\My_CSV_File.csv'
WITH (FIELDTERMINATOR = ',')
This of course implies that your table has the same columns, in the same order as the csv files and that each column is the correct data type to accept the incoming data.
If you continue to have issues getting your data into SQL Server Express, please post in one of these two forums
[Transact-SQL|http://social.msdn.microsoft.com/Forums/en-US/transactsql/threads]
[SQL Server Express|http://social.msdn.microsoft.com/Forums/en-US/sqlexpress/threads]
The Transact-SQL forum has some VERY knowledgeable people (including MVPs and book authors) posting answers.
I've never posted to the SQL Server Express forum, but I'm sure they can troubleshoot your issues with the Import Export Wizard.
If you post in one of them, please copy the post link back to this thread so I can continue to help.
Jason -
// Code Help need .. in Reading CSV file and display the Output.
Hi All,
I am a newbie at coding and have just started learning; I have started with a console application and need your advice and suggestions.
I want to write code which reads the input from the CSV file and displays as output, in the console application, the combination of first name and last name appended with the name of the college in the village.
The example of CSV file is
Firstname,LastName
Happy,Coding
Learn,C#
I want to display the output as
HappyCodingXYZCollage
LearnC#XYXCollage
The below is the code I have tried so far.
// Reading a CSV
var reader = new StreamReader(File.OpenRead(@"D:\Users\RajaVill\Desktop\C#\input.csv"));
List<string> listA = new List<string>();
while (!reader.EndOfStream)
{
    var line = reader.ReadLine();
    string[] values = line.Split(',');
    listA.Add(values[0]);
    listA.Add(values[1]);
    listA.Add(values[2]);
    // listB.Add(values[1]);
}
foreach (string str in listA)
{
    //StreamWriter writer = new StreamWriter(File.OpenWrite(@"D:\\suman.txt"));
    Console.WriteLine("the value is {0}", str);
}
Console.ReadLine();
Kindly advise and let me know how to read the column header of the CSV file, so I can apply my logic to display the combination of first name, last name and the name of the college.
Best Regards,
Raja Village Sync
Beginner Coder
Very simple example:
var column1 = new List<string>();
var column2 = new List<string>();
using (var rd = new StreamReader("filename.csv"))
{
    while (!rd.EndOfStream)
    {
        var splits = rd.ReadLine().Split(';');
        column1.Add(splits[0]);
        column2.Add(splits[1]);
    }
}
// print column1
Console.WriteLine("Column 1:");
foreach (var element in column1)
    Console.WriteLine(element);
// print column2
Console.WriteLine("Column 2:");
foreach (var element in column2)
    Console.WriteLine(element);
Mark as answer or vote as helpful if you find it useful | Ammar Zaied [MCP] -
Need to take a value from the csv file and query in a OAF page.
Hello,
I have a requirement to take the list of employee numbers in a CSV file and display their corresponding jobs on the page.
I have created an item 'MessageFileupload' where the user will upload the CSV file containing the employee numbers, and a button 'Display Jobs' which will display the corresponding jobs on the page.
Any idea how to take the values from the csv file and query it?
Regards,
den123
Hi,
Check
http://oraclearea51.com/contribute/post-a-blog-article/csv-file-upload-for-oa-framework.html
http://www.roseindia.net/jsp/upload-insert-csv.shtml
The code below, taken from the blogs above, works.
package xx.oracle.apps.pa.Lab.webui;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
// import java.io.*;
import oracle.apps.fnd.common.VersionInfo;
import oracle.apps.fnd.framework.OAApplicationModule;
import oracle.apps.fnd.framework.OAException;
import oracle.apps.fnd.framework.server.OAViewObjectImpl;
import oracle.apps.fnd.framework.webui.OAControllerImpl;
import oracle.apps.fnd.framework.webui.OAPageContext;
import oracle.apps.fnd.framework.webui.beans.OAWebBean;
import oracle.jbo.domain.BlobDomain;
import oracle.cabo.ui.data.DataObject;
import oracle.jbo.Row;
/**
 * Controller for ...
 */
public class deptCsvUploadCO extends OAControllerImpl
{
  public static final String RCS_ID = "$Header$";
  public static final boolean RCS_ID_RECORDED =
    VersionInfo.recordClassVersion(RCS_ID, "%packagename%");

  /**
   * Layout and page setup logic for a region.
   * @param pageContext the current OA page context
   * @param webBean the web bean corresponding to the region
   */
  public void processRequest(OAPageContext pageContext, OAWebBean webBean)
  {
    super.processRequest(pageContext, webBean);
  }

  /**
   * Procedure to handle form submissions for form elements in
   * a region.
   * @param pageContext the current OA page context
   * @param webBean the web bean corresponding to the region
   */
  public void processFormRequest(OAPageContext pageContext, OAWebBean webBean)
  {
    super.processFormRequest(pageContext, webBean);
    // Code Addition Started for CSV upload
    OAApplicationModule am = (OAApplicationModule) pageContext.getApplicationModule(webBean);
    OAViewObjectImpl vo = (OAViewObjectImpl) am.findViewObject("deptCsvVO1");
    //if ("GoBtn".equals(pageContext.getParameter(EVENT_PARAM)))
    if (pageContext.getParameter("GoBtn") != null)
    {
      System.out.println("Button Pressed");
      DataObject fileUploadData = (DataObject) pageContext.getNamedDataObject("FileUploadItem");
      String fileName = null;
      String contentType = null;
      Long fileSize = null;
      Integer fileType = new Integer(6);
      BlobDomain uploadedByteStream = null;
      BufferedReader in = null;
      try
      {
        fileName = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_NAME");
        contentType = (String) fileUploadData.selectValue(null, "UPLOAD_FILE_MIME_TYPE");
        uploadedByteStream = (BlobDomain) fileUploadData.selectValue(null, fileName);
        in = new BufferedReader(new InputStreamReader(uploadedByteStream.getBinaryStream()));
        fileSize = new Long(uploadedByteStream.getLength());
        System.out.println("fileSize" + fileSize);
      }
      catch (NullPointerException ex)
      {
        throw new OAException("Please Select a File to Upload", OAException.ERROR);
      }
      try
      {
        // Open the CSV file for reading
        String lineReader = "";
        long t = 0;
        String[] linetext;
        while ((lineReader = in.readLine()) != null)
        {
          // Split the delimited data
          if (lineReader.trim().length() > 0)
          {
            System.out.println("lineReader" + lineReader.length());
            linetext = lineReader.split(",");
            t++;
            // Print the current line being
            if (!vo.isPreparedForExecution())
            {
              vo.setMaxFetchSize(0);
              vo.executeQuery();
            }
            System.out.println("Trimmed " + linetext[1].replace("\"", ""));
            Row row = vo.createRow();
            row.setAttribute("Deptno", linetext[0].trim());
            row.setAttribute("Dname", linetext[1].trim().replace("\"", ""));
            row.setAttribute("Loc", linetext[2].trim().replace("\"", ""));
            //row.setAttribute("Column4", linetext[3].trim());
            vo.last();
            vo.next();
            vo.insertRow(row);
          }
        }
      }
      catch (IOException e)
      {
        throw new OAException(e.getMessage(), OAException.ERROR);
      }
      //else if (pageContext.getParameter("Upload") != null)
      am.getTransaction().commit();
      throw new OAException("Uploaded SuccessFully", OAException.CONFIRMATION);
    }
  }
}
Thanks,
Jit -
Combine CSV columns from multiple sources into one
Hi,
I am trying to import columns from one CSV file and output them as additional columns on a Get-MailboxStatistics | Export-CSV command. The issue with the script I have created is that when I use an expression to create the new column, it outputs all of the items in the column from the imported CSV into one row. See the script below; I have highlighted the expressions that are not working.
Any assistance would be appreciated.
[Array]$Global:NotesMailSizeMaster = (Import-Csv c:\usermigration.csv).MailSizeMaster
[Array]$Global:NotesMailSizeRefresh = (Import-Csv c:\usermigration.csv).MailSizeRefresh
[Array]$Global:UserAlias = (Import-Csv c:\usermigration.csv).Alias
if (!(Get-PSSnapin |
        Where-Object { $_.name -eq "Microsoft.Exchange.Management.PowerShell.Admin" }))
{
    ADD-PSSnapin Microsoft.Exchange.Management.PowerShell.Admin
}
$Output = ForEach ($Alias in $UserAlias)
{
    Get-MailboxStatistics $Alias | select-object DisplayName, ServerName, StorageGroupName, Database,
        @{ Name = "TotalItemSize"; expression = { $_.TotalItemSize.Value.ToMB() } },
        @{ Name = "NotesMailSize(PreRefresh)"; expression = { $NotesMailSizeMaster } },
        @{ Name = "NotesMailSize(PostRefresh)"; expression = { $NotesMailSizeRefresh } }
}
$filename = (Get-Date).ToString("yyyyMMdd")
$Output | export-csv -Path c:\"$filename-mailboxstats.csv" -NoTypeInformation
There's a lot wrong with this script:
1. You're importing the same file 3x; you should import one and reference the things you need out of it.
2. If the CSV has more than one row in it, the dot notation you're using to access the MailSizeMaster, MailSizeRefresh, and Alias properties is being used incorrectly. If you are trying to reference each of those properties from an array of entries, the proper way would be:
$Import = @(Import-Csv C:\usermigration.csv)
$MailSizeMaster = $Import | Select-Object MailSizeMaster
OR
$MailSizeMaster = $Import | Select-Object -Expand MailSizeMaster
...the "OR" is not PowerShell.
3. I don't know if you can concatenate different objects with the select-object cmdlet like you are. However, what you could do is create a custom object to load the values into and then output that object:
$filename = (Get-Date).ToString("yyyyMMdd")
$Results = foreach ($item in $Import) {
    $MS = Get-MailboxStatistics $item.Alias
    New-Object PSObject -Property @{
        'DisplayName'       = $MS.DisplayName
        'ServerName'        = $MS.ServerName
        'StorageGroupName'  = $MS.StorageGroup
        'Database'          = $MS.Database
        'NotesMailSizePre'  = $item.MailSizeMaster
        'NotesMailSizePost' = $item.MailSizeRefresh
    }
}
$Results | Select-Object DisplayName,ServerName,StorageGroupName,Database,NotesMailSizePre,NotesMailSizePost |
    Export-Csv -NoTypeInformation "C:\$($filename)-mailboxstats.csv"
-
How to configure sync rules involving a CSV file and portal self service
Hello,
I need to configure some FIM sync rules for the following scenario:
User account details are entered from an HR CSV file and exported to AD. Users have the ability to modify their own AD attributes in the FIM portal (there is not a requirement for them to view their HR CSV data in the portal). The FIM portal modifications will be exported to AD as expected.
My setup is as follows:
CSV file - name, last name, employee ID, address.
CSV MA - has direct attribute flows configured in the MA between the data source and the MV.
Portal self-service attributes - users can edit mobile, display name and photo.
I've also set the CSV MA as precedent for the attributes
FIM MA – attribute flows defined for MV to Data Source as usual (i.e. firstname to firstname, accountname to accountname, etc).
AD MA – no attribute flows defined as inbound and outbound sync rules have been configured in the portal using the Set\MPR\Triple.
I’m thinking of using the following run profiles:
1. CSV MA – full import and delta sync (imports HR data)
2. FIM MA – export and delta import (imports portal changes)
3. FIM MA – delta sync (syncs any portal changes)
4. AD MA – export and delta import
If my understanding is correct this should sync HR data from CSV to AD, as well as user attribute self service updates from the portal to AD.
If I wanted to just do a HR CSV sync could I get away with just steps 1 & 4 ? (presumably not as my rules are in the FIM portal?)
If I wanted to do just a portal sync, could I get away steps 2-4?
Any advice on how to improve my setup is much appreciated - cheers
IT Support/Everything
The truth is that your design should be done in such a way that it doesn't matter which profiles you execute in which order. In the end, if you run all import, sync and export profiles on each data source, you should get the same result. This is the beauty of the sync engine.
Your steps 1-4 will sync data to your data sources and in the end will give you the expected result - not because of the order you are executing them, but because of correct attribute flows. If flows from the CSV file and from the FIM portal are done for the same attributes, you also need to think about attribute precedence.
Tomek Onyszko, memberOf Predica FIM Team (http://www.predica.pl), IdAM knowledge provider @ http://blog.predica.pl -
Loop through a csv file and return the number of rows in it?
What would be the simplest way to loop through a CSV file and return the number of rows in it?
<cffile action="read" file="#filename#" output="#csvstr#"
>
<LOOP THROUGH AND COUNT ROWS>ListLen(). Use chr(13) as your delimiter
-
Read a csv file and read the fiscal yr in the 4th pos?
Hello ABAP Experts,
How do I write code to read a CSV file and read the fiscal year in the 4th position?
any suggestions or code highly appreciated.
Thanks,
BWer
Hi Bwer,
Declare table itab with the required fields...
Use GUI_UPLOAD to get the contents of the file (say abc.csv) in case the file is on the presentation server...
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = 'C:\abc.csv'
    filetype            = 'ASC'
    has_field_separator = 'X'
  TABLES
    data_tab            = itab
  EXCEPTIONS
    file_open_error     = 1
    no_batch            = 2
    OTHERS              = 22.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
Use OPEN DATASET in case the file is on the application server.
After that, use the SPLIT command at the comma to get the contents of the 4th field...
Regards,
Tanveer.
Please mark helpful answers
Performance issue with big CSV files as data source
Hi,
We are creating Crystal reports for a large banking corporation with CSV files as the data source. For some reports, we need to join 2 CSV files. The problem we have met is that when the 2 CSV files are very large (both >200 MB), the performance is very bad and it takes an hour or so to refresh the data in the Crystal Reports designer. The same is true for both CR 11.5 and CR 2008.
My question is: is there any way to improve performance in such situations? For example, can we create an index on the CSV files? If you have ever created reports connecting to CSV, your suggestions will be highly appreciated.
Thanks,
Ray
Certainly a reasonable concern...
The question at this point is, How are the reports going to be used and deployed once they are in production?
I'd look at it from that direction.
For example... They may be able to dump the data directly to another database on a separate server that would insulate the main enterprise server. This would allow the main server to run the necessary queries during off peak hours and would isolate any reporting activity to a "reporting database".
This would also keep the data secure and encrypted (it would continue to enjoy the security provided by an RDBMS). Text & csv files can be copied, emailed, altered & deleted by anyone who sees them. Placing them in encrypted .zip folders prevents them from being read by external applications.
Hope you liked the sales pitch I wrote for you to give to the client... =^)
If all else fails and you're stuck using the csv files, at least see if they can get it all on one file. Joining the 2 files is killing you performance wise... More so than using 1 massive file.
Jason