Pretty URLs vs. Duplicate Content

Posted 2024-10-02 22:04:09

I'm trying to clear up a grey area about this much talked about topic...

Like most devs, I've made some pretty URLs with mod_rewrite. My site's internal links point to the pretty URLs and things are working nicely.

But, I can still access the old URL if I point to it directly.

Now, this is most certainly going to cause duplicate content issues, so after doing some research it seems that 301 redirects are the way to go.

But.... and here's the grey bit...

If you are working on a site with thousands of URLs, what's best practice to achieve this? I don't want to list 1k+ lines in .htaccess. I thought of a regexp in my rewrite rule, but my pretty URLs have names from the database in them... and I can't access that from .htaccess :)

Have I hit a dead end? Is there a way around this? Would Google's canonical tag be a possibility??
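One hedged note on the "1k+ lines" concern: a single mod_rewrite rule can funnel every request into one front-controller script, which *can* consult the database that .htaccess cannot. A minimal sketch (the `index.php` name and `path` parameter are illustrative, not from the original post):

```apache
# .htaccess — send every request that isn't a real file or
# directory through one PHP front controller, which can then
# look up the database and issue 301s or serve the page.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?path=$1 [L,QSA]
```

With this in place, the redirect logic lives in PHP rather than in thousands of per-URL rewrite rules.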


3 Answers

‘画卷フ 2024-10-09 22:04:09

Well, I don't know if this is the "definitive" answer, but I have a bunch of "functional" URLS like:

http://www.flipscript.com/product.aspx?cid=7&pid=42&ds=asdjlf8i7sdfkhsjfd978

but I remap the URLs, link to them and list them in my site map as:

http://www.flipscript.com/ambigram-ring.aspx

I haven't seen ANY evidence that multiple URLs pointing to identical content within the same domain have any negative impact on SEO.

In fact, over the past year, I have climbed to the #1 position on Google with this in place for my primary keyword.

My theory about why this should be so is that Google applies the duplicate content penalty for entire "clone sites", not for just linking with different URLs to the same content within a single site.

蓝眼睛不忧郁 2024-10-09 22:04:09

A quick and dirty way would be to re-route everything on the site via a PHP file that checks whether the path is still valid, querying the database if necessary. Use a 301 redirect if the path has permanently moved. Soon enough these "grey urls" should hardly ever turn up, and indexes should be updated across search engines. At that point you can remove the router.

If you could specify what your "grey url" looks like I may be able to suggest a better alternative.
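The router idea above can be sketched in PHP. The array-based lookup below stands in for a database query, and all names (`canonicalPathFor`, the example paths) are illustrative assumptions, not from the original post:

```php
<?php
// Front-controller sketch: if the request came in on an old
// "functional" URL, 301-redirect it to the pretty URL.

// Return the pretty path for a known old path, else null.
// In practice this lookup would query the database.
function canonicalPathFor(string $requestPath, array $map): ?string
{
    return $map[$requestPath] ?? null;
}

// Stand-in for the database mapping (illustrative values).
$map = [
    '/product.aspx?cid=7&pid=42' => '/ambigram-ring.aspx',
];

$requested = $_SERVER['REQUEST_URI'] ?? '/';
$pretty = canonicalPathFor($requested, $map);

if ($pretty !== null && $pretty !== $requested) {
    // Permanent redirect: search engines transfer the old
    // URL's ranking to the new one.
    header('Location: ' . $pretty, true, 301);
    exit;
}
// ...otherwise fall through and serve the page normally.
```

Once search indexes stop showing the old URLs, the mapping table (and the router) can be retired, as the answer suggests.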

为人所爱 2024-10-09 22:04:09

"Would Google's canonical tag be a possibility??" -- Why not?

--> It automatically transfers page rank

--> Google recommends the canonical tag even when the content differs slightly but is more or less similar.

--> Too many 301 redirects to pages within a site are bad for SEO (my personal experience with Bing).

--> Too many 301 redirects increase the effective load time of content for your users (especially bad if the ping time from their location to your server is high).
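For reference, the canonical tag the answers discuss is a single line in the `<head>` of each duplicate page, pointing at the preferred URL. A sketch reusing the flipscript URLs from the first answer purely as an example:

```html
<!-- Placed in the <head> of the old/functional URL's page,
     telling search engines which URL is the preferred one. -->
<link rel="canonical" href="http://www.flipscript.com/ambigram-ring.aspx" />
```

Unlike a 301, this costs the user no extra round trip: both URLs keep serving the page, and only crawlers consolidate them.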
