Is there a better way to do this with multiprocessing?
- I am using Python multiprocessing to process files. The last processed date per file is stored in a dict, i.e. dict_A = {'file1_xx': '8-04-22', 'file2_xx': '8-04-22', 'file3_xx': '8-04-22', 'file4_xx': '8-04-22'}.
- The files directory is scanned, and each filename with its last modified date is stored in dict_test. The two dicts are then compared to detect new files: for each file (e.g. file1_xx), its last modified date in dict_test is compared against its last processed date in dict_A. A condition should update dict_A whenever a file's last modified date is greater than its last processed date.
- The issue I am facing is that the dictionary is not updated after the files are processed.
- Ideally, dict_A should be updated with the latest modified date per file of the same category. This dict_A is then uploaded to a PostgreSQL database through SQLAlchemy.
import multiprocessing

def compare_rec(i):
    a = dict_A[i]
    b = dict_test[i]
    if a >= b:
        print("none")
    else:
        lock.acquire()
        print("found")
        a = b  # rebinds the local name only; dict_A is never modified
        lock.release()

def init(l):
    global lock
    lock = l

if __name__ == '__main__':
    file_cat = ['a', 'b', 'c', 'd']
    dict_A = {'a': '10', 'b': '10', 'c': '10', 'd': '10'}
    dict_test = {'a': '11', 'b': '11', 'c': '11', 'd': '11'}
    l = multiprocessing.Lock()
    pool = multiprocessing.Pool(initializer=init, initargs=(l,))
    pool.map(compare_rec, file_cat)
    pool.close()
    pool.join()
Processes don't share variables.
In the function I would use return to send the filename and date back to the main process. The main thread should get the results from all the processes and update the dictionary.
Full code:
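The answer's full code was not captured in this copy of the page. Below is a minimal sketch of the approach it describes, using the same example data as the question: each worker returns its result instead of mutating a shared dict, and only the main process updates dict_A after pool.map() collects the results, so no lock is needed.

```python
import multiprocessing

# Same example data as the question (string dates for illustration).
dict_A = {'a': '10', 'b': '10', 'c': '10', 'd': '10'}      # last processed dates
dict_test = {'a': '11', 'b': '11', 'c': '11', 'd': '11'}   # last modified dates

def compare_rec(key):
    # Return (key, new_date) when the file was modified after the
    # last processed date; otherwise return (key, None).
    if dict_test[key] > dict_A[key]:
        return key, dict_test[key]
    return key, None

if __name__ == '__main__':
    file_cat = ['a', 'b', 'c', 'd']
    with multiprocessing.Pool() as pool:
        results = pool.map(compare_rec, file_cat)
    # Only the main process touches dict_A, so the update is safe.
    for key, new_date in results:
        if new_date is not None:
            dict_A[key] = new_date
    print(dict_A)  # every entry now holds the newer date '11'
```

Note that the dicts are defined at module level so they are also available in worker processes under the "spawn" start method, where each child re-imports the module.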