
The best of page size

Monday, October 21, 2019
I often see questions about page size that come back to the same theme: what is the "standard" page size? There is no firm standard here, and the guidance you find is vague. In this article I want to share my opinion on what page size is good for SEO. We will analyse the question from two angles, common claims and real-world observation, so that you get the best possible overview. The article also lists free tools that can help you analyse and measure page size.

Claim: Modern connections offer high-bandwidth internet at maximum speed, so you no longer need to pay attention to page size.

Fact: If a page is too heavy (more than 100 KB, a quantitative threshold that dates from long ago), Google can have difficulty indexing all of its information. Googlebot works against a limited crawl budget: if it has to spend most of its time fetching image and PDF files, it has very little time left for the other parts of the site.
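As a rough check against that old 100 KB guideline, a few lines of Python can report the weight of a raw HTML payload. This is a minimal sketch: the function names are my own, and the 100 KB threshold is the article's historical rule of thumb, not an official Google limit.

```python
def page_weight_kb(html: str) -> float:
    """Return the size of the HTML payload in kilobytes (UTF-8 encoded)."""
    return len(html.encode("utf-8")) / 1024

def exceeds_guideline(html: str, limit_kb: float = 100.0) -> bool:
    """True if the page is heavier than the (historical) 100 KB rule of thumb."""
    return page_weight_kb(html) > limit_kb

sample = "<html><body>" + "x" * 2048 + "</body></html>"
print(round(page_weight_kb(sample), 1))  # ~2.0 KB
```

Note this only measures the HTML itself; images, scripts, and stylesheets referenced by the page add their own weight on top.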
Claim: Content should not exceed 1,000 words.

Fact: Actually, there is no standard for content length. You can write 2,000 or 3,000 words; I have seen several sites with articles well past the 2,000-word mark that were fully indexed by Google without any problem. What matters is making the content comfortable for the reader; the crawlers will take care of the rest.
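Even though there is no official limit, it can be useful to measure how long a page's content actually is. A small sketch using Python's standard `html.parser` (the class and function names are hypothetical) that counts the visible words in an HTML document:

```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect the text nodes of an HTML document.

    Kept deliberately simple: it does not skip <script> or <style> content.
    """
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def word_count(html: str) -> int:
    parser = _TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

print(word_count("<p>Hello world</p><p>three words here</p>"))  # 5
```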

Claim: Google cannot crawl more than 100 links on a page.
Fact: Matt Cutts blogged about this issue long ago, noting that Google's spiders have for a long time been able to crawl more than 100 links per page. He still recommends that webmasters keep to roughly 100 links per page, because it makes Google's indexing easier. It also helps keep your site from being mistaken for a page of spam links or a link farm.
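To check a page against that roughly-100-links recommendation, you can count anchor tags with the standard library alone. A sketch (names are my own) that counts only `<a>` elements carrying an `href`:

```python
from html.parser import HTMLParser

class _LinkCounter(HTMLParser):
    """Count <a href=...> elements; anchors without href are ignored."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    parser = _LinkCounter()
    parser.feed(html)
    return parser.count

page = '<a href="/a">A</a><a href="/b">B</a><a name="x">no href</a>'
print(count_links(page))  # 2
```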
Some tools that help you analyse and measure page size:

I want to mention this part because, according to eMarketing statistics, 16% of visitors will leave your website if they have to wait more than 10 seconds, and roughly double that number leave after 15 seconds. In other words, you can lose almost half of your traffic simply because your site loads too slowly. This is very often the case with nulled (non-licensed) web or blog templates.

The best solution is to understand and make the most of your server, and to push the page weight down as low as possible. Aim for around 150 KB, or under 100 KB if you can, since pages of that weight work well with Google's crawlers and its cache.
Web Page Speed Report analyses your entire site, including:

  • Total page size;
  • Total size of the images (HTML and CSS images reported separately);
  • JavaScript size;
  • CSS size;
  • Each page object's size;

And download times for a set of connection rates:
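The transfer-time figures such tools report follow from simple arithmetic: payload size in kilobits divided by line speed. A hypothetical helper that estimates this for a few connection rates:

```python
def download_time_s(page_kb: float, speed_kbps: float) -> float:
    """Rough transfer time: payload in kilobits divided by line speed.

    Ignores latency, handshakes, and compression, so real times are higher.
    """
    return page_kb * 8 / speed_kbps

# Illustrative connection rates: dial-up, old DSL, a 4 Mbps line.
for speed in (56, 512, 4096):
    print(f"{speed} kbps: {download_time_s(150, speed):.1f} s")
```

A 150 KB page takes over 21 seconds on a 56 kbps dial-up line but well under half a second at 4 Mbps, which is why the connection-rate breakdown in these reports matters.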
Page Size Extractor gives a summary table that is fairly complete:

  • Total page size;
  • Text-to-HTML ratio;
  • Total number of hyperlinks;
  • Total number of images;
  • Total size of all images;
  • Each image's size;
  • A full list of all links on the page.
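The text-to-HTML ratio in that table is straightforward to compute yourself: visible text length divided by total markup length. A sketch using the standard `html.parser` (class and function names are my own):

```python
from html.parser import HTMLParser

class _TextOnly(HTMLParser):
    """Collect text nodes so the visible-text length can be measured."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        self.text.append(data)

def text_to_html_ratio(html: str) -> float:
    """Visible text characters divided by total document characters."""
    if not html:
        return 0.0
    parser = _TextOnly()
    parser.feed(html)
    return len("".join(parser.text)) / len(html)

print(round(text_to_html_ratio("<p>1234567890</p>"), 2))  # 0.59
```

A higher ratio means more of the page weight is actual content rather than markup, which is generally what you want for SEO.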

Firefox Web Developer Extension offers a fairly complete summary of a page's images and markup.

To reduce page weight:

  • Get rid of all inessential page elements;
  • Clean up your CSS;
  • Get rid of frames;
  • Compress your images;
  • Clean up your HTML, and so on.
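The HTML clean-up step can be partly automated. A deliberately simple sketch that strips comments and collapses whitespace between tags; the regex approach is illustrative only and is not safe for every document (inline scripts or `<pre>` blocks, for example):

```python
import re

def minify_html(html: str) -> str:
    """Remove HTML comments and collapse runs of whitespace between tags."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.S)
    html = re.sub(r">\s+<", "><", html)
    return html.strip()

src = """<div>
    <!-- banner markup left over from an old template -->
    <p>Hello</p>
</div>"""
print(minify_html(src))  # <div><p>Hello</p></div>
```

For production use, a dedicated minifier is safer than regexes, but even this removes the dead weight that nulled templates often carry.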
