Using compression does not hurt your page ranking. Matt Cutts talks about this in his article on the Crawl Caching Proxy.
Your page load time can also be greatly improved by resizing your images. While you can use the height and width attributes in the img tag, this does not change the size of the image files that are downloaded to the browser. Resizing the images before putting them on your pages can reduce the load time by 50% or more, depending on the number and type of images that you're using.
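For example, specifying display dimensions in the markup still downloads the full-size file; the file names here are just illustrative:

    <!-- The full-size photo.jpg is still downloaded, then scaled down by the browser -->
    <img src="photo.jpg" width="400" height="300" alt="Product photo" />

    <!-- Better: upload a copy that was actually resized to 400x300 -->
    <img src="photo-400x300.jpg" width="400" height="300" alt="Product photo" />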
Other things that can improve your page load time are:
Use web standards/CSS for layout instead of tables
If you copy/paste content from MS Word, strip out the extra tags that Word generates
Put CSS and javascript in external files, rather than embedded in the page. This helps when users visit more than one page on your site, because browsers typically cache these files (a minimal sketch follows this list)
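A minimal sketch of the external-file approach, with made-up file names, instead of repeating the same style and script blocks inline on every page:

    <!-- External files: downloaded once, then served from the browser cache -->
    <link rel="stylesheet" type="text/css" href="/css/site.css" />
    <script type="text/javascript" src="/js/site.js"></script>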
This Web Page Analyzer will give you a speed report that shows how long the different elements of your page take to download.
Use gzip compression to compress the HTML at the transport stage, then just make sure the code validates and that you are using logical tags for everything.
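To illustrate what "the transport stage" means: the browser advertises gzip support and the server compresses only the response body, so the HTML itself is untouched and crawlers see the same markup. Roughly, the exchange looks like this (illustrative headers, not from any particular server):

    GET /index.html HTTP/1.1
    Host: www.example.com
    Accept-Encoding: gzip, deflate

    HTTP/1.1 200 OK
    Content-Type: text/html
    Content-Encoding: gzip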
The SEO guy mentioned that semantic, valid HTML scores more points with crawlers than jumbled, messy HTML.
If an SEO guy ever tries to present something like that as fact, tell him to provide a source, because to the best of my knowledge it is simply untrue. If the content is there, it will be crawled. It is a common urban myth amongst SEO analysts that just isn't true.
However, the use of header tags is recommended: <H1> for the page title, <H2> for main headings, and then lower levels for lower headings.
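For example, using this thread's own topic as illustrative content:

    <h1>Compressing HTML and SEO</h1>            <!-- one H1 for the page title -->
    <h2>Does compression hurt rankings?</h2>     <!-- H2 for main headings -->
    <h3>What Matt Cutts says</h3>                <!-- lower levels below that -->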
I've been working on a real-time HTML compressor that will decrease our page sizes by a pretty good chunk. Will compressing the HTML hurt us in site rankings?
If it can be read on the client side without problems then it is perfectly fine. If you want to look up any of this, I recommend anything referencing Matt Cutts, or the following post:
FAQ: Search Engine Optimisation
I would suggest using compression at the transport layer, and eliminating whitespace from the HTML, but not sacrificing the semantics of your markup in the interest of speed. In fact, the better you "compress" your markup, the less effective the transport layer compression will be. Or, to put it a better way, let the gzip transfer-coding slim your HTML for you, and pour your energy into writing clean markup that renders quickly once it hits the browser.
Compressing HTML should not hurt you.
When you say HTML compressor, I assume you mean a tool that removes whitespace etc. from your pages to make them smaller, right? This doesn't impact how a crawler will see your HTML, as it likely strips the same things from the HTML when it grabs the page from your site. The 'semantic' structure of the HTML exists whether it is compressed or not.
You might also want to look at:
Compressing pages with GZIP compression in the web server
Reducing the size of images, CSS, JavaScript, etc.
Considering how the browser's layout engine loads your pages.
By jumbled HTML, this SEO person probably means the use of tables for layout and the re-purposing of built-in HTML elements (e.g. <p class="headerOne">Header 1</p>). This increases the ratio of HTML tags to page content, or keyword density in SEO terms. It has bigger problems, though (a corrected sketch follows the list below):
Longer page load times due to the extra markup to download (why not just use the H1 tag?).
It's difficult for screen readers to understand and hurts site accessibility.
Browsers may take longer to render the content depending on how they parse and layout pages with styles.
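As promised above, a sketch of the fix: keep the styling hook but use the real heading element (the class name is invented):

    <!-- Re-purposed element: looks like a heading, but reads as a plain paragraph -->
    <p class="headerOne">Header 1</p>

    <!-- Semantic version: crawlers and screen readers see an actual heading -->
    <h1 class="headerOne">Header 1</h1>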
I once retooled a messy tables-for-layout site to XHTML 1.0 Transitional and the page size went from 100 KB to 40 KB. The images loaded went from 200 KB to just 50 KB.
The reason I got such large savings was that the site had all the JS embedded in every page. I also retooled all the JS so it was correct for both IE6 and FF2. The images were also compiled down to an image map. All the techniques were well documented on A List Apart and easy to implement.
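For reference, a client-side image map in the spirit of that change replaces several separate button images with a single download; the file name and coordinates here are invented:

    <img src="nav-strip.gif" usemap="#mainnav" alt="Main navigation" />
    <map name="mainnav">
      <area shape="rect" coords="0,0,100,30" href="/home.html" alt="Home" />
      <area shape="rect" coords="100,0,200,30" href="/about.html" alt="About" />
    </map>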
First, check your code: make sure the HTML and CSS validate against W3C standards (for example with the validator at validator.w3.org).
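As a baseline, a minimal skeleton that validates as XHTML 1.0 Transitional (the doctype the earlier answer mentions) looks like this:

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
      <head>
        <title>Example page</title>
      </head>
      <body>
        <h1>Example</h1>
        <p>Valid, well-formed markup.</p>
      </body>
    </html>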