How to allow multiple threads to read and write files without locking problems
I have a web application that checks for the modification of a config.json file on each page load. It checks the modified date of the file and compares it against the last processing time that was recorded. If it differs, it proceeds to allow the reading of the file contents, processing those contents and updating config.json as well as writing to a separate file. I want to ensure that multiple simultaneous connections both reading and writing to these two files won't cause an issue.
var lastWriteTime = File.GetLastWriteTimeUtc(ConfigJsonPath);
if (CacheTime != lastWriteTime)
{
var config = new ConfigWriter().ReadData(ConfigJsonPath);
var model = JsonConvert.DeserializeObject<StyleModel>(config);
// This method writes to another file
ProcessConfig(model, page);
var serialized = JsonConvert.SerializeObject(model, Formatting.Indented);
new ConfigWriter().WriteData(serialized, ConfigJsonPath);
CacheTime = File.GetLastWriteTimeUtc(ConfigJsonPath);
}
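As an aside on the snippet above: the check-then-act on CacheTime is itself a race — two simultaneous requests can both observe a stale CacheTime and both run the reload. A minimal double-checked sketch (class and member names are hypothetical, and the processing step is assumed to be passed in as a delegate) that serializes the reload while keeping the no-change path lock-free:

```csharp
using System;
using System.IO;

public static class ConfigReloader
{
    private static readonly object Sync = new object();
    private static DateTime _cacheTime;

    public static void ReloadIfChanged(string configJsonPath, Action<string> process)
    {
        var lastWriteTime = File.GetLastWriteTimeUtc(configJsonPath);
        if (_cacheTime == lastWriteTime)
            return; // fast path: file unchanged, no lock taken

        lock (Sync)
        {
            // Re-check inside the lock: another request may have
            // already reloaded while we were waiting.
            lastWriteTime = File.GetLastWriteTimeUtc(configJsonPath);
            if (_cacheTime == lastWriteTime)
                return;

            var config = File.ReadAllText(configJsonPath);
            process(config); // caller's deserialize/process/write step
            _cacheTime = File.GetLastWriteTimeUtc(configJsonPath);
        }
    }
}
```

With this shape at most one request pays the reprocessing cost per file change; the rest either skip it or wait briefly and then skip it.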
ConfigWriter Class
public class ConfigWriter
{
private static readonly ReaderWriterLockSlim Lock = new ReaderWriterLockSlim();
public void WriteData(string data, string path)
{
Lock.EnterWriteLock();
try
{
using (var fs = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write))
{
fs.SetLength(0);
var dataAsByteArray = new UTF8Encoding(true).GetBytes(data);
fs.Write(dataAsByteArray, 0, dataAsByteArray.Length);
}
}
finally
{
Lock.ExitWriteLock();
}
}
public string ReadData(string filePath)
{
Lock.EnterReadLock();
try
{
string config;
using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
{
using (var r = new StreamReader(fs, Encoding.UTF8))
{
config = r.ReadToEnd();
}
}
return config;
}
finally
{
Lock.ExitReadLock();
}
}
}
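One pattern sometimes used instead of (or alongside) the lock, so that a reader can never observe a half-written file, is write-to-temp-then-swap. A sketch, assuming NTFS (the AtomicFile name is mine):

```csharp
using System.IO;
using System.Text;

public static class AtomicFile
{
    // Write the new contents to a temp file next to the target, then swap
    // it into place. The swap is atomic, so a concurrent reader sees either
    // the old contents or the new contents, never a partial write.
    public static void WriteAllTextAtomic(string path, string contents)
    {
        var tempPath = path + ".tmp";
        File.WriteAllText(tempPath, contents, new UTF8Encoding(true));
        if (File.Exists(path))
            File.Replace(tempPath, path, destinationBackupFileName: null);
        else
            File.Move(tempPath, path);
    }
}
```

Note this only addresses torn reads of a single file; it does not coordinate the read-modify-write cycle itself, so the process-wide lock (or something like it) is still doing real work.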
So in the end there is one file being read and two files being altered (written), and I'm using the same lock for the writes.
Did I overdo it here? I started with a simple FileStream with FileShare set to read/write, then got paranoid and started second-guessing myself. Is there a better way to implement this?