How can a PHP application use the actual URLs if mod_rewrite is disabled?

Posted 2024-09-11 04:55:02


I am working on a PHP application. Everything works perfectly; the only problem is this:

I have enabled SEO-friendly URLs, which rewrite the actual URLs to virtual URLs (I know you all know this).

Example: hxxp://www.website.com/index.php?page=about-us
becomes
hxxp://www.website.com/page/about-us/

What I want to achieve is that if SEO URLs / mod_rewrite are disabled, the user should still be able to access the direct/actual URLs.

In brief: if mod_rewrite is enabled, the web application should automatically use the SEO-friendly URLs; otherwise it should fall back to the default URLs.


Comments (3)

撩人痒 2024-09-18 04:55:03


You would have to replace all occurrences of links with a function that checks if mod_rewrite is available, or more likely, a config value. It would then return the appropriate link.

getLink("?page=about-us")
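A minimal sketch of such a helper, assuming a hypothetical `SEO_URLS` config constant stands in for the check (the name and the URL patterns are illustrative, not from the original app):

```php
<?php
// Hypothetical config flag: true when mod_rewrite / SEO URLs are enabled.
define('SEO_URLS', true);

// Turn "?page=about-us" into the appropriate link for the current setting.
function getLink(string $query): string
{
    if (!SEO_URLS) {
        // mod_rewrite disabled: keep the plain GET-parameter URL.
        return '/index.php' . $query;
    }
    // mod_rewrite enabled: build the pretty /page/<slug>/ form.
    parse_str(ltrim($query, '?'), $params);
    return '/page/' . rawurlencode($params['page'] ?? '') . '/';
}
```

With `SEO_URLS` set to `true`, `getLink('?page=about-us')` yields `/page/about-us/`; with it set to `false`, the same call falls back to `/index.php?page=about-us`.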

知足的幸福 2024-09-18 04:55:03


Use an <IfModule> block to avoid breaking other .htaccess directives and/or causing 500 Internal Server Errors if Apache doesn't understand your rules. Also add a single non-rewriting RewriteRule (before all others):

<IfModule mod_rewrite.c>
    RewriteEngine On
    # The next rule does no rewriting, but sets an environment variable.
    RewriteRule .* - [E=RewriteCapable:On]
</IfModule>

In your PHP file (store the result as a setting, or check it wherever URLs are generated/output):

if (isset($_SERVER['RewriteCapable'])) {
    // make fancy URLs
} else {
    // kludgy old-style URLs
}
若无相欠,怎会相见 2024-09-18 04:55:03


Hmm.. this may need a bit of thought to get the correct solution.. follow me here if you will :)

SEO URLs were primarily introduced to (1) include human-readable text in the URLs and (2) get rid of the GET parameters.

To look at point (2) for a moment: this was the primary driver initially, because people used about.php?id=1, id=2 ... id=3457348 to get the same page listed in the search engines multiple times, which of course got detected and stopped. Then sometimes people would pass a session id=24234234, which would also get stopped as a duplicate page (rightfully, as it treats HTTP as a stateful protocol when it's not).

With a URL, everything from the first character up to the # of a #fragment defines a resource (from an HTTP perspective), so rightly, when several different URLs all resolve to the same 'page', they are indeed duplicates.

So, by dropping the GET parameters you solve this problem. By the way, it isn't actually a problem now and hasn't been for a long time; there's no reason not to use GET params properly other than vanity.

So, really you have solved no problem but have instead introduced a new one: you want '/page/about-us' and '?page=about-us' to both go to the same 'page', which means you've got duplicate resources again; this could be detected and you could get penalised.

Thus, by introducing 'SEO URLs' you've actually created the problem SEO URLs were 'invented' to counteract.

This only leaves the point about human-readable words in the URL. URLs are supposed to be opaque, so they don't count for anything in reality, but some still like them. So I'd have to ask what's wrong with using '/?/page/about-us'... and if you don't like that, then what's wrong with creating a fixed file with the filesystem path '/page/about-us' which simply includes your index.php with the right variables set?
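For the fixed-file idea, such a file could be little more than the following stub (paths and the `$_GET['page']` convention are assumptions about how the main script reads its input):

```php
<?php
// page/about-us/index.php — a real file sitting at the pretty path.
$_GET['page'] = 'about-us';           // set the variable the main script expects
require __DIR__ . '/../../index.php'; // hand off to the actual application
```
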

Of course you can create duplicate pages and have both SEO-friendly URLs and GET-param URLs, but as you can see, that won't be very SEO-friendly now, will it?

Something to chew on :)
