Upload CSV file to SQL server
What is the best way to upload a large CSV data file into SQL Server using C#? The file contains about 30,000 rows and 25 columns.
SQL Server doesn't have any facility for uploading files to it; you'll need to use some other means (e.g. a web service running on IIS or something like that). SQL Server can load a file from a disk it can reach, so you'll need to find a way to put that file where SQL Server can read it from.
– marc_s
Dec 24 '13 at 10:36
The requirement is such that we have an application through which we loop and upload files one by one into SQL Server, so I need to use C# code to upload the files.
– user3083221
Dec 24 '13 at 10:39
4 Answers
First off, you don't need any programming at all: you can upload CSV files directly into a SQL Server database with the SQL Server management tools. However, if you really need to do it through programming, just read below.
Personally, I think this approach is the most efficient and easiest way to do it through programming.
In general, you can achieve it in two steps:

The 1st step is to read the CSV file and hold the records as a DataTable.
The 2nd step is to store the retrieved DataTable in a SQL Server database table as a bulk entry.

This is a function that returns the CSV file data as a DataTable. Call it, keep the result in memory, and you can do whatever you want with it.
// Requires a reference to Microsoft.VisualBasic and:
// using System;
// using System.Data;
// using Microsoft.VisualBasic.FileIO;
private static DataTable GetDataTabletFromCSVFile(string csv_file_path)
{
    DataTable csvData = new DataTable();
    try
    {
        using (TextFieldParser csvReader = new TextFieldParser(csv_file_path))
        {
            csvReader.SetDelimiters(new string[] { "," });
            csvReader.HasFieldsEnclosedInQuotes = true;

            // Read the first row as column headers
            string[] colFields = csvReader.ReadFields();
            foreach (string column in colFields)
            {
                DataColumn datecolumn = new DataColumn(column);
                datecolumn.AllowDBNull = true;
                csvData.Columns.Add(datecolumn);
            }

            while (!csvReader.EndOfData)
            {
                string[] fieldData = csvReader.ReadFields();
                // Making empty value as null
                for (int i = 0; i < fieldData.Length; i++)
                {
                    if (fieldData[i] == "")
                    {
                        fieldData[i] = null;
                    }
                }
                csvData.Rows.Add(fieldData);
            }
        }
    }
    catch (Exception ex)
    {
        return null;
    }
    return csvData;
}
SqlBulkCopy - use this function to insert the retrieved DataTable into a SQL table.
// Requires: using System.Data; using System.Data.SqlClient;
static void InsertDataIntoSQLServerUsingSQLBulkCopy(DataTable csvFileData)
{
    using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=yourDB;Integrated Security=SSPI;"))
    {
        dbConnection.Open();
        using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "Your table name";
            foreach (DataColumn column in csvFileData.Columns)
                s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
            s.WriteToServer(csvFileData);
        }
    }
}
Source
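Putting the two functions above together, a typical call sequence might look like this (the file path is a placeholder):

```csharp
DataTable csvData = GetDataTabletFromCSVFile(@"C:\data\input.csv");
if (csvData != null)
{
    InsertDataIntoSQLServerUsingSQLBulkCopy(csvData);
}
```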
It should be noted that this will read the whole file into the computer's memory before even starting to upload it to the server. This might be an issue if you are uploading lots of big files. A better approach might be to create an array of rows for chunks of the file and call the public WriteToServer(DataRow[] rows) overload.
– Martin Brown
Oct 27 '14 at 12:52
@MartinBrown I think that would cause unnecessarily high traffic. Doing it one time as a bulk operation would be the easy way to go, and rolling back when there is an error would also be easy.
– Kirk
Apr 7 '16 at 19:55
If you see "Microsoft.VisualBasic.FileIO does not exist", click this link.
– Wikis
Jun 15 '16 at 14:12
Loading millions of rows into a DataTable would lead to an out-of-memory exception. Is there any other way to do this task?
– Learning-Overthinker-Confused
Nov 14 '16 at 6:47
@Learning If you notice, there is a loop in the code; you can set your own limit of, say, 10,000 rows per chunk or something similar.
– Kirk
Nov 14 '16 at 6:51
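One middle ground between the positions in this thread is SqlBulkCopy's BatchSize property, which commits rows to the server in chunks while keeping a single WriteToServer call. A minimal sketch (the connection string and table name are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

static void InsertInBatches(DataTable csvFileData, string connectionString)
{
    using (var dbConnection = new SqlConnection(connectionString))
    {
        dbConnection.Open();
        using (var s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "Your table name";
            s.BatchSize = 10000;   // commit every 10,000 rows
            s.BulkCopyTimeout = 0; // disable the 30-second default timeout
            s.WriteToServer(csvFileData);
        }
    }
}
```

Note that this limits the size of each server-side transaction but does not reduce client memory use; the whole DataTable is still held in memory, so for very large files a streaming IDataReader source is the better fit.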
Use the System.Data.SqlClient.SqlBulkCopy class to insert data into SQL tables.
To use that class you also need to convert the CSV data to a DataTable; see one of the ways here.
This sounds like a perfect job for SSIS. It's a free part of SQL Server, can loop through all the csv files in a folder, is very fast, and has excellent error handling and logging.
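If the SSIS route fits, a package can still be kicked off per file from C#, for instance by shelling out to dtexec. This is only a sketch; the package path and variable name are placeholders:

```csharp
using System.Diagnostics;

static int RunSsisPackage(string csvPath)
{
    // dtexec ships with SQL Server; /F runs a package from the file system
    // and /SET overrides a package variable with the current file's path.
    var psi = new ProcessStartInfo
    {
        FileName = "dtexec",
        Arguments = "/F \"C:\\packages\\LoadCsv.dtsx\" " +
                    "/SET \\Package.Variables[User::CsvPath].Value;\"" + csvPath + "\"",
        UseShellExecute = false
    };
    using (var process = Process.Start(psi))
    {
        process.WaitForExit();
        return process.ExitCode; // 0 means the package succeeded
    }
}
```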
This technique uses the SQLBulkCopy() facility, but does not read the entire file into memory.
The trick is that it implements an IDataReader class to read the .csv file.
https://www.codeproject.com/Tips/1029831/Fast-and-Simple-IDataReader-Implementation-to-Read
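To illustrate the idea (the linked article has a complete implementation), here is a compressed sketch of a CSV-backed IDataReader: only the members SqlBulkCopy actually calls are implemented, the rest throw. Rows are pulled from disk one at a time, so the whole file is never in memory:

```csharp
using System;
using System.Data;
using Microsoft.VisualBasic.FileIO;

// Minimal IDataReader over a CSV file, for streaming into SqlBulkCopy.
class CsvDataReader : IDataReader
{
    private readonly TextFieldParser _parser;
    private readonly string[] _headers;
    private string[] _current;

    public CsvDataReader(string path)
    {
        _parser = new TextFieldParser(path);
        _parser.SetDelimiters(",");
        _parser.HasFieldsEnclosedInQuotes = true;
        _headers = _parser.ReadFields(); // first row = column names
    }

    // Members SqlBulkCopy uses:
    public int FieldCount => _headers.Length;
    public bool Read()
    {
        if (_parser.EndOfData) return false;
        _current = _parser.ReadFields();
        return true;
    }
    public object GetValue(int i) =>
        string.IsNullOrEmpty(_current[i]) ? (object)DBNull.Value : _current[i];
    public int GetOrdinal(string name) => Array.IndexOf(_headers, name);

    // Housekeeping:
    public void Close() => _parser.Close();
    public void Dispose() => _parser.Dispose();
    public bool IsClosed => false;
    public int Depth => 0;
    public int RecordsAffected => -1;
    public bool NextResult() => false;
    public string GetName(int i) => _headers[i];
    public Type GetFieldType(int i) => typeof(string);
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));
    public bool IsDBNull(int i) => GetValue(i) == DBNull.Value;
    public string GetString(int i) => _current[i];

    // Remaining IDataRecord members are not needed for this scenario:
    public DataTable GetSchemaTable() => throw new NotSupportedException();
    public int GetValues(object[] values) => throw new NotSupportedException();
    public string GetDataTypeName(int i) => throw new NotSupportedException();
    public bool GetBoolean(int i) => throw new NotSupportedException();
    public byte GetByte(int i) => throw new NotSupportedException();
    public long GetBytes(int i, long o, byte[] b, int bo, int l) => throw new NotSupportedException();
    public char GetChar(int i) => throw new NotSupportedException();
    public long GetChars(int i, long o, char[] b, int bo, int l) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public DateTime GetDateTime(int i) => throw new NotSupportedException();
    public decimal GetDecimal(int i) => throw new NotSupportedException();
    public double GetDouble(int i) => throw new NotSupportedException();
    public float GetFloat(int i) => throw new NotSupportedException();
    public Guid GetGuid(int i) => throw new NotSupportedException();
    public short GetInt16(int i) => throw new NotSupportedException();
    public int GetInt32(int i) => throw new NotSupportedException();
    public long GetInt64(int i) => throw new NotSupportedException();
}
```

Usage mirrors the DataTable version, except the reader is passed straight to WriteToServer:

```csharp
using (var reader = new CsvDataReader(@"C:\data\input.csv"))
using (var bulk = new SqlBulkCopy("your connection string"))
{
    bulk.DestinationTableName = "Your table name";
    bulk.WriteToServer(reader);
}
```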
Why use .NET to upload the file when SQL Server itself supports importing CSV files?
– Furqan Hameedi
Dec 24 '13 at 10:36
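For completeness, the built-in import Furqan alludes to can also be driven from C# by executing a BULK INSERT statement. As marc_s notes above, this only works when the file path is readable from the SQL Server machine itself; the table name and path below are placeholders:

```csharp
using System.Data.SqlClient;

static void BulkInsertOnServer(string connectionString)
{
    // The FROM path is resolved on the SQL Server machine, not on the client.
    const string sql = @"BULK INSERT dbo.YourTable
                         FROM 'C:\data\yourfile.csv'
                         WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        command.ExecuteNonQuery(); // runs the import server-side
    }
}
```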