NodeJs Performance Issue

Posted 2024-11-05 13:03:42

I'm building a realtime stats application using NodeJs. For the prototype I'm testing on a quad-core AMD Opteron RackSpace server, running a NodeJs server with the Cluster module ( http://learnboost.github.com/cluster/ ) and MongoDb with the native NodeJs driver.

Basically, I've inserted a JS snippet into my company's project, which delivers content for a bunch of clients' websites. This code "pings" my server every 10 seconds by requesting an image and passing parameters that I read on the server side and insert (or update) into a MongoDb collection. In a "slow" time of day I get about 3000 connections at a time (measured with the netstat -natp command on the terminal), which makes my cluster use about 25% of each core (measured with the top command). But in a "busy" hour I get about 7000+ connections at a time, which makes my cluster go crazy (about 80%+ usage of each core), and it seems that, as time goes by, node degrades.
Is this normal? Or should NodeJs handle these hits more easily? Would the performance improve if I used Mongoose?

In case you are curious about MongoDb, it uses about 4% of one core, which is fine by me (without an index the usage was about 50%+, but at least the index solved that performance problem).
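
For reference, this is roughly what creating that index can look like with the native driver (a minimal sketch; the collection and field names are taken from the code below, and ensureIndex is the call in the old driver, while newer drivers use createIndex):

db.collection('clientsessions', function(err, collection) {
    // index on the id field that the per-client updates below match against
    collection.ensureIndex({ id: 1 }, function(err, indexName) {
        if (err) console.log(err);
    });
});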

Thanks a lot for your patience,
Cheers.

Edit:

The code that makes the insert looks like this:
db.open(function(err, db) { });

return connect.router(function(app){
    app.get("/pingserver/:clientid/:event/:cachecontrol", function(req, res, next){
        console.log('event: ' + req.params.event + ', cachecontrol: ' + req.params.cachecontrol);
        var timestamp = new Date(); 
          switch(req.params.event) {
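          // 'load' inserts a new session document; 'ping' and any other event
          // update the existing document for this clientid.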
          case 'load':
              var params = url.parse(req.url, true).query;

              db.collection('clientsessions', function(err, collection) {
                try {

                    var client = {
                        id: req.params.clientid,
                        state: req.params.event + 'ed',
                        loadTime: timestamp.getTime(),
                        lastEvent: req.params.event,
                        lastEventTime: timestamp.getTime(),
                        lastEventDate: timestamp.toString(),
                        events: [{
                            event: req.params.event,
                            timestamp: timestamp.getTime(),
                            date: timestamp.toString()
                        }],
                        media: {
                            id: params.media.split('|')[0] || null,
                            title: unescape(params.media.split('|')[1]) || null
                        },
                        project: {
                            id: params.project.split('|')[0] || null,
                            name: unescape(params.project.split('|')[1]) || null
                        },
                        origin: req.headers['referer'] || req.headers['referrer'] || '',
                        userAgent: req.headers['user-agent'] || null,
                        userIp: req.socket && (req.socket.remoteAddress || (req.socket.socket && req.socket.socket.remoteAddress)),
                        returningUser: false
                    };
                } catch (e) { console.log(e); }
                collection.insert(client, function(err, doc) {});
              });
              break;

          case 'ping':
              db.collection('clientsessions', function(err, collection) {
                  collection.update({ id: req.params.clientid }, {
                      $set: {
                          lastEvent: req.params.event,
                          lastEventTime: timestamp.getTime(),
                          lastEventDate: timestamp.toString()
                      }
                  }, {}, function(err, doc) {});
              });
              break;

          default:
              db.collection('clientsessions', function(err, collection) {
                  collection.update({ id: req.params.clientid }, {
                      $set: {
                          state: req.params.event + 'ed',
                          lastEvent: req.params.event,
                          lastEventTime: timestamp.getTime()
                      },
                      $push: {
                          events: {
                              event: req.params.event,
                              timestamp: timestamp.getTime(),
                              date: timestamp.toString()
                          }
                      }
                  }, {}, function(err, doc) {});
              });

              break;
          }

          if (!transparent) {
              console.log('!transparent');
              transparent = fs.readFileSync(__dirname + '/../../public/images/transparent.gif', 'binary');
          }
          res.setHeader('Content-Type', 'image/gif');
          res.setHeader('Content-Length', transparent.length);

          res.end(transparent, 'binary');
      });
});

Comments (6)

听不够的曲调 2024-11-12 13:03:42

Is this normal?

Depends. Are the connections going away on their own, or do they just keep building up? Are you talking about "web connections" (http) or MongoDB connections?

What do the mongod logs say? What do the node logs say?

How many requests are you getting per second?
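
As a rough way to measure that, here is a minimal sketch of a throwaway per-second counter (requestCount is a hypothetical variable, not something from the question's code):

var requestCount = 0;

// log and reset the counter once per second
setInterval(function() {
    console.log('requests/sec: ' + requestCount);
    requestCount = 0;
}, 1000);

// then increment it at the top of the route handler:
// requestCount++;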

Or should Nodejs handle these hits in a more "easy" way?

Hard to say without knowing what the code is doing.

How many simultaneous connections do you expect the box to handle?

If I use Mongoose, the performance can increase?

So Mongoose is actually an object wrapper around the node-mongodb-native driver. It is not a different driver, it's just a wrapper.

The wrapper is going to add code on top of the code you already have. If you have a code problem, then adding more code is not guaranteed to make the problem better. If Mongoose does solve your problem, then it's doing something with connections that you're not. If that's the case, you don't necessarily need Mongoose, you just need better connection management.
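
For example, a minimal sketch of what "better connection management" can mean here: open the database once and cache the collection handle instead of calling db.collection on every request (names are taken from the question's code; this is an illustration, not the asker's actual setup):

var clientSessions = null;

db.open(function(err, db) {
    if (err) return console.log(err);
    // look the collection up once and reuse the handle on every request
    db.collection('clientsessions', function(err, collection) {
        if (err) return console.log(err);
        clientSessions = collection;
    });
});

// in the route handler, guard against the handle not being ready yet:
// if (clientSessions) {
//     clientSessions.update({ id: req.params.clientid },
//         { $set: { lastEvent: req.params.event } }, {}, function(err, doc) {});
// }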


Look, there are lots of potential sources for your issue.

The only way to get this solved is to break out the pieces and dig in with much more detail. Places to start:
- are connections to MongoDB closing correctly (look at the db logs)?
- do the logs contain any other errors?
- do the same thing for the node logs?
- do you have graphs regarding the memory usage? who's taking up the most memory?
- when you get to 80% of each core, which process is doing this? mongod? node? something else?

To really help you out here, we need a lot more data about what's going on with the system.

祁梦 2024-11-12 13:03:42

Continuous requests can be quite expensive, especially if the interval between them is small. In your case you are accepting roughly 300 to 700+ requests per second, and your system load will depend mainly on what you are processing. You can try switching to Mongoose, but I would rather look at the image handling and caching, if that is applicable to your scenario, since the DB does not seem to be your bottleneck (although the DB driver may also be an issue).

葬花如无物 2024-11-12 13:03:42

if (!transparent) {
    console.log('!transparent');
    transparent = fs.readFileSync(__dirname + '/../../public/images/transparent.gif', 'binary');
}

How often is transparent false? I don't see it defined in the code. You're blocking the entire Node process on synchronous disk IO, potentially for every request. Why? If you have to read the file from disk, do it asynchronously. If the file is static and small, maybe you should load it into memory once.
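
A minimal sketch of the load-once approach (the file path is the one from the question; doing the read at startup is my suggestion, not the original code):

var fs = require('fs');

// read the tracking pixel into a Buffer once, at startup
var transparent = fs.readFileSync(__dirname + '/../../public/images/transparent.gif');

// in the request handler, just reuse the Buffer:
// res.setHeader('Content-Type', 'image/gif');
// res.setHeader('Content-Length', transparent.length);
// res.end(transparent);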

一桥轻雨一伞开 2024-11-12 13:03:42

Node's http server has keep-alive enabled by default. In your case it causes too many useless connections. Just try adding a header to disable keep-alive; a plain node setup with cluster would then be fine.

res.setHeader("Connection", "close")
我只土不豪 2024-11-12 13:03:42

Just an update:

I've dropped the cluster and put an Nginx layer on the server. Now it takes a lot longer to "degrade", but it is still happening, and in particular it consumes a lot of system RAM.
Any thoughts?

And thanks a lot for all the answers!

Edit:
Re-did some tests. I think the MAIN PROBLEM is the open connections. When I run netstat on the Nginx port it shows around 2000 connections. When I run it on each NodeJs app port it shows 2000 (or more). Basically, my "best case scenario" would be for the sum of the open connections on the NodeJs apps to match the open connections on the Nginx port, right? I think this is the main problem, and it shows up as a huge number of "time_wait" statuses.
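
One way to see how many sockets each node process itself is holding, rather than relying only on netstat, is to count them on the 'connection' event. A rough sketch, where server stands for the http server the connect app is listening on (an assumption, not the question's code):

var openSockets = 0;

// 'server' is the http server instance the connect app is attached to
server.on('connection', function(socket) {
    openSockets++;
    socket.on('close', function() {
        openSockets--;
    });
});

// print the current count every 5 seconds
setInterval(function() {
    console.log('open sockets: ' + openSockets);
}, 5000);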

南街九尾狐 2024-11-12 13:03:42

You might also just wanna serve up that transparent gif from a buffer as done here:

https://gist.github.com/657246#comments
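
For reference, the general idea of serving the pixel from an in-memory buffer looks something like this (a sketch, not the gist's exact code; the base64 string is the standard 1x1 transparent GIF):

// a 1x1 transparent GIF kept entirely in memory, no disk read at all
var transparent = new Buffer('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7', 'base64');

// in the handler:
// res.writeHead(200, { 'Content-Type': 'image/gif', 'Content-Length': transparent.length });
// res.end(transparent);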
