Load balancing PHP web applications across three servers

Published 2024-11-18 06:17:13


I have 2 web servers and 1 server that is intended to be used as a reverse proxy or load balancer. The 2 web servers have real/public IPs, as does the load balancer. The load balancer server is not configured yet because I have not decided which option will be best for my web applications. I am aware that a single load balancer is risky because it is a "single point of failure", but I want to go with it.

The 2 web servers host more than one PHP application/vhost (Apache + FastCGI) with the same domain and different subdomains. They all use sessions and cookies, and some of them require SSL. My main goal is simply to split incoming connections in half and forward them to the 2 web nodes. If one web node goes offline, the other should accept all connections. I think that, if needed, we could do session sharing with memcache.

I have read about Nginx, HAProxy and every other application I could find on the net, but I cannot decide because:

1) I have 1 server that I can use as a load balancer, but all the configurations I found on the net require 2 load balancer nodes.
2) I am not sure where I should install the SSL certificates (on the load balancer or on the web nodes), and which solution is best when using HTTPS connections.

Any help/ideas would be appreciated, thanks a lot.
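(For reference, the memcache-based session sharing mentioned above usually just means pointing PHP's session handler at a shared memcached instance on both web nodes. A minimal php.ini sketch, assuming the php-memcached extension and a placeholder memcached address:

    ; store PHP sessions in a shared memcached instance instead of local files
    session.save_handler = memcached
    session.save_path    = "192.168.1.20:11211"
)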


Comments (4)

无声静候 2024-11-25 06:17:13


I've used HAProxy in a similar setup and I highly recommend it. It runs very well on a single server, and you only need a second load-balancer node if you want high availability.

There are many tutorials, such as this one, that provide detailed explanations of how to set it up in the configuration you are looking for.
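A minimal HAProxy configuration for this kind of setup might look like the sketch below; IP addresses, backend names and timeouts are placeholders, not a production-ready config:

    global
        daemon
        maxconn 4096

    defaults
        mode    http
        timeout connect 5s
        timeout client  30s
        timeout server  30s

    frontend http-in
        bind *:80
        default_backend web_nodes

    backend web_nodes
        balance roundrobin
        option httpchk GET /                 # health check so a dead node stops receiving traffic
        server web1 192.168.1.10:80 check    # first web node
        server web2 192.168.1.11:80 check    # second web node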

安稳善良 2024-11-25 06:17:13


Just yesterday I created a configuration with an NGINX server as a load balancer, behind which there are 2 PHP-FPM servers, 1 Memcache server and 1 MySQL server. NGINX was configured using the upstream feature, and the relevant configuration lines look something like this:

http {

    ...

    # load balancing
    upstream myLoadBalancer {
        ip_hash; # makes sure the same user hits the same server; not 100% effective - the
                 # application should handle this. In my case 1 Memcached and 1 MySQL server,
                 # commonly used by all app servers, work just fine. I store sessions in
                 # Memcache, so session management isn't a problem at all: it's shared
                 # across all app servers.
        server 192.168.1.10:9000; # location of my first php-fpm server
        server 192.168.1.11:9000; # second php-fpm server
        # server aaa.bbb.ccc.ddd:80; # let's say, an Apache server
    }

    # vhost
    server {
        listen 80;
        server_name mydomain.com;
        index index.php;

        location ~* \.php$ {
            gzip on;
            try_files $uri =404;
            include fastcgi_params;
            fastcgi_pass myLoadBalancer;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME /path/to/webdir$fastcgi_script_name;
            fastcgi_param PATH_INFO $fastcgi_script_name;
        }
    }
}

HTTPS: the certificates should be installed on the NGINX load balancer if I am not mistaken - I have never tried it myself. Once the client request is passed down to an app server, the app server processes it and sends the response back to NGINX, and NGINX encrypts the content before relaying the response to the client. That's the theory, of course, but I am 100% positive NGINX can handle SSL quite easily. It is the fastest and most robust proxy/balancer, and a good web server when combined with the FastCGI capabilities of various CLIs.
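If you terminate SSL on the NGINX balancer as described, the port-80 vhost above would roughly get a TLS twin like this sketch; the certificate paths are placeholders and this is not a hardened configuration:

    server {
        listen 443 ssl;
        server_name mydomain.com;

        ssl_certificate     /etc/nginx/ssl/mydomain.com.crt;
        ssl_certificate_key /etc/nginx/ssl/mydomain.com.key;

        index index.php;

        location ~* \.php$ {
            try_files $uri =404;
            include fastcgi_params;
            fastcgi_pass myLoadBalancer;              # same upstream as the port-80 vhost
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME /path/to/webdir$fastcgi_script_name;
            fastcgi_param HTTPS on;                   # tell PHP the request came in over TLS
        }
    }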

NOTE: This is not a configuration for a production environment, but a test-case scenario. A production environment would require much more secure settings. The following resources could be of use:

蓝戈者 2024-11-25 06:17:13


In any situation where you want HTTPS to more than 1 web server, your ideal solution is to have Nginx installed in front of your web servers. It can also be used for load balancing, but if you need more complex configuration options, it's a good idea to have Nginx forward requests to your HAProxy instance. Both services use minimal resources, so don't worry about running both.
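As a rough illustration of that layered setup, Nginx could terminate HTTPS and simply hand plain HTTP to a local HAProxy instance; the local port and certificate paths below are assumptions:

    server {
        listen 443 ssl;
        server_name mydomain.com;

        ssl_certificate     /etc/nginx/ssl/mydomain.com.crt;
        ssl_certificate_key /etc/nginx/ssl/mydomain.com.key;

        location / {
            proxy_pass http://127.0.0.1:8080;             # HAProxy listening locally on port 8080
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-Proto https;     # so the PHP apps know the original scheme
        }
    }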

I'm familiar with the idea of having only 3 servers and not wanting redundant load balancers. I actually wrote an eBook on scaling architectures, and it includes some examples with 2-3-4 servers and only 1 load balancer. Maybe it can provide more information and help you get started.

黑凤梨 2024-11-25 06:17:13


You can do what you need with just one HAProxy node (although this still leaves you with a single point of failure).

I have written a tutorial for installing HAProxy on Rackspace Cloud, which can be adapted to any Ubuntu setup. By using the cookie option you can also enforce session persistence, so there is no need to share sessions between boxes; users will only lose their session if the box they are on goes down mid-session.

Once you have your standard HTTP traffic balanced through HAProxy, you can send your SSL through it as well using the mode tcp option. TCP mode can't insert cookies into the request, so use the source balance mode instead; this balances based on the user's IP, so it won't change mid-session unless you add additional nodes.

Your SSL certs are then installed on both your web nodes as HAProxy is just balancing the TCP traffic. Incidentally, you can use this balance mode with anything over TCP, including MySQL connections for some real high availability solutions.
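Putting the two ideas together, a sketch of such an HAProxy configuration could look like the following; the global/defaults sections are omitted, and the IPs and names are placeholders:

    frontend http-in
        bind *:80
        mode http
        default_backend web_http

    backend web_http
        mode http
        balance roundrobin
        cookie SERVERID insert indirect nocache        # sticky sessions via an inserted cookie
        server web1 192.168.1.10:80 check cookie web1
        server web2 192.168.1.11:80 check cookie web2

    listen https_passthrough
        bind *:443
        mode tcp
        balance source                                 # stick by client IP; certs stay on the web nodes
        server web1 192.168.1.10:443 check
        server web2 192.168.1.11:443 check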
