SQL bulk copy will not upload data to the database using a file stream

Posted 2025-01-08 09:32:35

I am uploading a file through a file upload control; a StreamReader then reads the file into a DataTable, and SqlBulkCopy copies the DataTable to my SQL database and fills the appropriate columns. Does anyone see anything wrong with the code below? I don't get an error message, but it seems to get hung up in the IIS process. I can't delete the CSV file from the folder because it says the process is still using it.

    protected void btnUpload_Click(object sender, EventArgs e)
    {

        //upload file to the gencouploadfiles folder
        UploadFile();

        //fetch CSV file from the folder
        string strFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";

        //perform sql bulk copy
        PerformBulkCopy(GencoUpload(strFilePath));

        //delete the file from the folder
    }
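One gap worth noting: the final `//delete the file from the folder` comment has no code behind it. Once the stream that read the file has been disposed, the delete is a one-liner. A minimal sketch (the path mirrors the one built above):

```csharp
// Sketch: delete the uploaded file after the bulk copy finishes.
// File.Delete only succeeds once every reader on the file has been
// disposed; otherwise it throws because the file is still locked.
string strFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";
if (File.Exists(strFilePath))
{
    File.Delete(strFilePath);
}
```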



    public void UploadFile()
    {
        if (fileUpload1.HasFile)
        {
            FileInfo fileinfo = new FileInfo(fileUpload1.PostedFile.FileName);
            string strCsvFilePath = Server.MapPath("GencoUploadFiles") + "\\" + "GencoUploadFile.txt";
            fileUpload1.SaveAs(strCsvFilePath);
        }
    }


    public static DataTable GencoUpload(string filepath)
    {
        StreamReader sr = new StreamReader(filepath);
        string line = sr.ReadLine();
        string[] value = line.Split('|');
        DataTable dt = new DataTable();
        DataRow row;

        foreach (string dc in value)
        {
            dt.Columns.Add(new DataColumn(dc));
        }

        while (!sr.EndOfStream)
        {
            value = sr.ReadLine().Split('|');
            if (value.Length == dt.Columns.Count)
            {
                row = dt.NewRow();
                row.ItemArray = value;
                dt.Rows.Add(row);
            }
        }
        return dt;
    }


    public void PerformBulkCopy(DataTable dt)
    {
        SqlConnection conStr = new SqlConnection(ConfigurationManager.ConnectionStrings["EDI"].ConnectionString);

        using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conStr.ConnectionString))
        {
            bulkcopy.DestinationTableName = "dbo.GencoUploadTempTable";
            bulkcopy.BatchSize = dt.Rows.Count;
            conStr.Open();
            bulkcopy.WriteToServer(dt);
            bulkcopy.Close();
            conStr.Close();
        }
    }
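The connection object in `PerformBulkCopy` is created but never disposed if `WriteToServer` throws, and it is not actually the connection the `SqlBulkCopy` uses (only its connection string is passed). A sketch of a tighter version, assuming the DataTable column names match the destination table:

```csharp
// Sketch: pass the open connection to SqlBulkCopy and wrap both
// objects in using() so they are disposed even on failure. Explicit
// column mappings (assumed to match the header row and the table)
// avoid relying on ordinal column position.
public void PerformBulkCopy(DataTable dt)
{
    string connStr = ConfigurationManager.ConnectionStrings["EDI"].ConnectionString;
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
    {
        bulkcopy.DestinationTableName = "dbo.GencoUploadTempTable";
        foreach (DataColumn col in dt.Columns)
        {
            bulkcopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
        }
        conn.Open();
        bulkcopy.WriteToServer(dt);
    }
}
```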


1 Answer

束缚m 2025-01-15 09:32:35


Simplify your code; there are plenty of ways to get a CSV into a DataTable - see How to read a CSV file into a .NET DataTable, for example. In your example above, you do not appear to be closing the stream after you have finished reading it... add sr.Close() before the return, or better still wrap the declaration in a using() statement:

    public static DataTable GencoUpload(string filepath)
    {
        DataTable dt = new DataTable();

        using (StreamReader sr = new StreamReader(filepath))
        {
            string line = sr.ReadLine();
            string[] value = line.Split('|');

            DataRow row;

            foreach (string dc in value)
            {
                dt.Columns.Add(new DataColumn(dc));
            }

            while (!sr.EndOfStream)
            {
                value = sr.ReadLine().Split('|');
                if (value.Length == dt.Columns.Count)
                {
                    row = dt.NewRow();
                    row.ItemArray = value;
                    dt.Rows.Add(row);
                }
            }
        }

        return dt;
    }

This should prevent the file from becoming locked.

Next thing to look at is to check if the data table actually has any data in it. Put a break point in and test to see if your load code is actually working; you are adding rows to a data table but you haven't defined the column structure (i.e. you've only supplied names not data types so you might get conversion issues). It's definitely going to be easier to use one of the other methods for loading your file :-)
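To illustrate that last point: adding columns from the header row alone gives every column the default string type. A sketch of a typed schema instead (column names and types here are hypothetical; they would have to match dbo.GencoUploadTempTable):

```csharp
// Sketch: declare typed columns instead of string-only columns
// inferred from the header row.
DataTable dt = new DataTable();
dt.Columns.Add("ItemNumber", typeof(string));
dt.Columns.Add("Quantity", typeof(int));
dt.Columns.Add("ShipDate", typeof(DateTime));

// Assigning ItemArray now converts "42" to int 42, and throws on
// values that do not parse, so bad rows fail fast instead of
// surfacing later as a bulk-copy conversion error.
DataRow row = dt.NewRow();
row.ItemArray = new object[] { "ABC123", "42", "2025-01-08" };
dt.Rows.Add(row);
```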
