MySQL - which solution should I use for archiving data?
I am using MySQL 5.x, and there is one main table that holds the combined stats for all customers. I would like to run reporting on it, but I don't want to run it against the same table, so I'm thinking of moving the data every night to a different table that will only be used for reporting. My question is: would it be beneficial to have a separate archive table for each customer, or just one archive table for all customers? There could be thousands of customers in the system, which could mean thousands of archive tables if I break it up by customer. Your thoughts?
Comments (2)
If you use an individual table for each customer, the number of tables will keep growing.
If you have statistics data, then I suggest summarizing it into rollup tables, like below.
If you want the whole thing together, you need to combine those summary tables.
Alternatively, use MySQL replication: do inserts, updates, and deletes on the master, and run reporting SELECTs on a slave.
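The summary-table definitions this answer refers to are missing from this copy. A plausible sketch of the rollup approach, with hypothetical table and column names, might look like:

```sql
-- Hypothetical rollup table; names and columns are illustrative only.
CREATE TABLE stats_daily (
    customer_id INT NOT NULL,
    stat_date   DATE NOT NULL,
    hits        BIGINT NOT NULL DEFAULT 0,
    PRIMARY KEY (customer_id, stat_date)
);

-- Nightly rollup from the main stats table into the daily summary,
-- assuming the main table is called customer_stats with a created_at column.
INSERT INTO stats_daily (customer_id, stat_date, hits)
SELECT customer_id, DATE(created_at), COUNT(*)
FROM customer_stats
WHERE created_at >= CURDATE() - INTERVAL 1 DAY
  AND created_at <  CURDATE()
GROUP BY customer_id, DATE(created_at);
```

Reports then read from the small summary table rather than the large live one; combining daily rollups into larger periods is a simple GROUP BY over this table.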
I see this all as a way of organizing your data; if you're dealing with things like customers and orders, they should be separate tables anyway.
Depending on the resources available, you could probably do something like this:
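The snippet the answer showed here is missing from this copy; a plausible full-copy sketch (table names are hypothetical) would be:

```sql
-- Rebuild the reporting table as a nightly snapshot of the live table
-- (customer_stats and customer_stats_report are assumed names).
DROP TABLE IF EXISTS customer_stats_report;
CREATE TABLE customer_stats_report LIKE customer_stats;
INSERT INTO customer_stats_report SELECT * FROM customer_stats;
```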
That would give you a perfect copy of the table you want to run the report on. If your resources are limited, you could probably batch it, like so:
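The batching snippet is also missing here; a hedged sketch, assuming an auto-increment `id` column and the hypothetical table names above:

```sql
-- Copy one batch at a time to limit the load on the live table.
INSERT INTO customer_stats_report
SELECT * FROM customer_stats
WHERE id > 0 AND id <= 10000;
```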
...and on the next iteration:
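The follow-up snippet is missing as well; presumably it is the same statement with the id range advanced, for example:

```sql
-- Next batch: advance the id window by the batch size.
INSERT INTO customer_stats_report
SELECT * FROM customer_stats
WHERE id > 10000 AND id <= 20000;
```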
As far as archiving the information goes, it depends on the environment you are working in. For example, the environment my stuff runs in is 24/7 and extremely high-demand. I have an interim data engine which keeps all the data in an in-RAM database, and when a record is seen as "closed", it is moved into a MySQL database for archiving. That way, when the need arises, I can run whatever reports I like against the MySQL database without impacting the operational run-time of the live data.
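A minimal sketch of that "move closed records to an archive" step, assuming a `status` column and hypothetical table names (the answer's actual engine is in-RAM, so this is only an analogy in plain MySQL):

```sql
-- Copy closed records into the archive, then remove them from the live table.
INSERT INTO customer_stats_archive
SELECT * FROM customer_stats WHERE status = 'closed';

DELETE FROM customer_stats WHERE status = 'closed';
```

Running both statements inside one transaction (on a transactional engine such as InnoDB) avoids losing rows if the delete fails partway.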
I can offer some ideas on how to archive your data, but that would require you to put forth a description of the type of volume and demand you're dealing with.