Running an ASP.NET website with MS SQL Server - when should I worry about scalability?
I run a medium sized website on an ASP.net platform and using MS SQL server to store the data.
My current site stats are:
~ 6000 Page Views a day
~ 10 tables in the SQL server with around 1000 rows per table
~ 4 queries per page served
The hosting machine has 1GB RAM
I expect by the end of 2009 to hit around:
~ 20,000 page views a day
~ 10 tables and around 4000 rows per table
~ 5 queries per page served
My question is: should I plan for scalability right now? Will the machine hold up till the end of the year with the expected stats?
I know my description is very top-level and does not provide insight into the kind of queries, etc., but I just wanted to know what your gut instinct tells you.
Thanks!
6 Answers
You should always plan for scalability. When to put resources into doing the actual scaling is usually the tough guess.
Way too little information to answer this. If a page request takes 30 CPU seconds to process due to massive interaction with a legacy enterprise application through the four queries per page, then there's no way. If it's taking minuscule fractions of a second to serve some static content stored in the cache, and your queries are only executed every half hour to refresh the content, then you're good until 2020 at the rate of traffic growth you describe.
My guess is that you're somewhere closer to the latter scenario. 20,000 page hits a day is not really a ton of traffic, but you'll need to benchmark your page and server performance at some point so that you can make the calculations you need.
Things to look at for scaling your site when it is time:
Two years ago I saw a relatively new (for two years ago) laptop running IIS and serving up 1100 to 1200 simple dynamic page requests per second. It had been set up by a consulting firm whose business was optimizing ASP.Net websites, but it goes to show you how much you can do.
Essentially, by the end of 2009, you expect to do 100,000 SQL queries per day. This is about 1.157 queries per second.
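As a quick sanity check on that arithmetic, here is a minimal C# sketch; the figures are the projections from the question, not measurements:

```csharp
using System;

class QueryRateEstimate
{
    static void Main()
    {
        // Projection from the question: 20,000 page views/day x 5 queries/page.
        int queriesPerDay = 20000 * 5;                     // 100,000 queries/day
        double queriesPerSecond = queriesPerDay / 86400.0; // ~= 1.157 queries/sec
        Console.WriteLine("{0} queries/day = {1:F3} queries/sec", queriesPerDay, queriesPerSecond);
    }
}
```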
I am making the assumption that your configuration is "normal" (i.e. you're not doing something funky and these are pretty straightforward SELECT, UPDATE, INSERT, etc), and that your server is running RAID disks.
At 4,000 rows per table this is nothing to SQL Server. You should be just fine. If you wanted to be proactive about it, put another stick of RAM in the server and bring it up to at least 2GB, so that IIS and SQL have plenty of memory (SQL will certainly take advantage of it).
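If you want to see how much memory SQL Server is currently allowed to use before adding RAM, a small sketch like the following queries the 'max server memory (MB)' setting; the connection string is a placeholder, and this is only a side check, not something the answer itself prescribes:

```csharp
using System;
using System.Data.SqlClient;

class SqlMemoryCheck
{
    static void Main()
    {
        // Placeholder connection string; adjust for the actual server.
        const string connectionString = "Server=localhost;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT value_in_use FROM sys.configurations WHERE name = 'max server memory (MB)'",
            connection))
        {
            connection.Open();
            object value = command.ExecuteScalar();
            Console.WriteLine("max server memory (MB) currently set to: {0}", value);
        }
    }
}
```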
The hosting machine? Does this mean that you have IIS and SQL installed on the same box, or IIS on your host machine with a dedicated SQL Server provided by your hosting company? Either way, I would suggest starting to look at how you might implement a caching layer to minimize the hits (where possible) to the database. Once this is PLANNED (not necessarily implemented), I would then start to look at how you might build a caching layer around your output (things built in ASP.NET). If you see a clear and easy path to building caching layers, then this is a quick and easy way to start minimizing requests to the database and the work on your web server.
I suggest that this cache layer be flexible - read: not tied to anything provided by .NET! Currently I still suggest using MemCached Win32. You can install it on your one hosted box easily and configure your cache layer to use local resources (add memory... 1GB is not enough). Then, if you find that you really need to squeeze every little bit of performance out of your system, splurge for a second box. Split your cache between your current box and the new box (allowing you to keep more in cache). This will give you some room (and time) to grow.
Offloading more to cache should help absorb any future spikes, and with the second box you can also start making your site work in a farmed environment. If you are using local session state, push it into the cache layer so that it won't matter which box serves a request (standard session state is local to the box that manages it).
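The answer recommends MemCached Win32; purely to illustrate the cache-aside idea it describes, here is a minimal sketch using the built-in ASP.NET cache instead (the Products table, its columns, and the 30-minute window are assumptions; a memcached client could be swapped in later for a cache shared across boxes):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class ProductCache
{
    // Cache-aside: check the in-process ASP.NET cache first, hit SQL only on a miss.
    public static DataTable GetProducts(string connectionString)
    {
        const string cacheKey = "products-all";
        var cached = HttpRuntime.Cache[cacheKey] as DataTable;
        if (cached != null)
            return cached;

        // Hypothetical table and columns, used only for illustration.
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT Id, Name, Price FROM Products", connection))
        {
            adapter.Fill(table);
        }

        // Refresh at most every 30 minutes, echoing the "every half hour" idea above.
        HttpRuntime.Cache.Insert(cacheKey, table, null,
            DateTime.UtcNow.AddMinutes(30), Cache.NoSlidingExpiration);
        return table;
    }
}
```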
This is a huge subject... so without real details this is all speculation, of course! You might be just fine adding better and more hardware to the existing installation.
Have you tried setting up a quick performance test using sample data? 20,000 page views a day is less than one per second (assuming even distribution over 8 hours), which is pretty minimal given your small tables. Assuming you're not sending a ton of data with each page view (i.e. a data table with all 1,000 rows from one of your tables), you are likely OK.
You may need to increase RAM, but other than running a performance test I wouldn't worry too much about performance right now.
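If it helps, a very rough sketch of such a quick test might look like the following; the URL and request count are placeholders, it runs a single synchronous client, and a real load test would use concurrent requests or a dedicated tool:

```csharp
using System;
using System.Diagnostics;
using System.Net;

class QuickLoadTest
{
    static void Main()
    {
        // Hypothetical target; point this at a representative page of the site.
        const string url = "http://localhost/Default.aspx";
        const int requests = 500;

        var watch = Stopwatch.StartNew();
        for (int i = 0; i < requests; i++)
        {
            using (var client = new WebClient())
            {
                client.DownloadString(url); // one synchronous page request
            }
        }
        watch.Stop();

        double perSecond = requests / watch.Elapsed.TotalSeconds;
        Console.WriteLine("{0} requests in {1:F1}s = {2:F1} pages/sec",
            requests, watch.Elapsed.TotalSeconds, perSecond);
    }
}
```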
I don't think the load you are describing would be too much of a problem for most machines. Of course it doesn't just depend on the few metrics you outlined but also on query complexity, page size, and a heap of other things.
If you worry about scalability, do some load testing and see how your site handles, say, 10,000 page views per hour (about 3 views per second). It's almost always good to plan ahead, as long as you plan for probable scenarios.
Guts say: Given 10 tables with 4,000 rows each, and assuming about 2KB of data per row, that is only 80MB for the entire database. Easily cached within the memory available. Assuming everything else about the application is equally simple, you should be able to easily serve hundreds of pages per second.
Engineers say: If you want to know, stress test your application.
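For what it's worth, the 80MB figure above works out like this (a trivial sketch, using the row size the answer assumes):

```csharp
using System;

class DatabaseSizeEstimate
{
    static void Main()
    {
        // Rough sizing from the answer: 10 tables x 4,000 rows x ~2 KB per row.
        long totalKb = 10L * 4000 * 2;                  // 80,000 KB
        Console.WriteLine("~{0} KB, i.e. roughly {1} MB for the whole database",
            totalKb, totalKb / 1000);                   // ~= 80 MB
    }
}
```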