You Should Create Unique Content
Since Panda's initial launch, Google has gradually updated the algorithm. These updates have primarily aimed at reducing duplicate content while returning better results to people searching Google. The importance of quality content cannot be stressed enough: Panda filters out low-quality content from content farms, along with content that is thin or poorly written.
Creating unique content is an essential part of SEO today. If you were an article spinner, you may have been hit hard by these Panda updates. Much of the content being created was written for the search engines, not for the people reading it. The Internet has always had an exorbitant amount of spam, and this spam was being created at an alarming rate.
First of all, the reduction in duplicated content allows the search engines to index content faster, which in turn produces better results. Today's search engines are trying to deliver the best possible results to searchers, and those results have improved because of quicker indexing and better-quality content and sources. Panda amplified these changes by forcing sites toward better content and quality.
Content Spin-off Test
The need to produce quality over quantity is still of the utmost importance today. Writing informative, helpful content that educates and adds value for the reader is a good example of quality content. During the first stages of Panda, I wrote one article and spun it into five different versions for testing purposes. This allowed me to measure Panda's impact as the rollout was starting. Although I did not see a significant decline in results, there were definitely some adverse effects from this testing.
I did manage to alter the different versions of the content enough that they were arguably fairly unique. I think the real adverse effect came when Panda started penalizing the sites the articles were submitted to; that is likely the only reason my site dropped in ranking for a short time. The sites the articles were submitted to may have lost some of their luster, or "juice," because of Panda. All in all, I am not sure of the exact cause, but the site has since recovered.
Content Spin-off Results
After completing the testing, it was clear that writing unique content is the correct way to go. There are several reasons for this:
- Proving you’re an authority in your industry
- Making it easier to claim your content
- No risk of declining results
- Knowing you followed guidelines
- Not being a spam contributor
- Not being hit with DMCA takedown notices
Duplicated Content on Site
Duplicate content within a website is another area Panda has affected. An example of this is having more than one page with the same information on each page. Although this is not a copyright issue, you are essentially producing the same content by placing the same information on multiple pages within the same domain. Why should this matter, you might ask? The goal of Panda here was to discourage creating pages purely for the sake of having more pages. Some websites were posting the same content on different pages to get more keywords into titles and descriptions and thereby appear in more results. This is really a spam strategy aimed at capturing more search engine results through mass production.
Although this was one of the principles behind Panda, I cannot honestly say I saw much of an impact concerning duplicate content within websites. For example, some sites have multiple landing pages directed at different locations, but the content is the same except for the meta titles, the descriptions, and a few minor changes to location names on the pages. Panda does not seem to pay much attention to the fact that these pages carry exactly the same content; hence, duplicate content. The only differences, as noted, were the titles and descriptions, and some of the descriptions were identical except for a location name.
Avoiding Duplicate Content Issues
- Make use of canonical URLs
- Use noindex and nofollow where appropriate
- Avoid identical content across pages
- Do not scrape other sites for content
- Make sure body, title, and description tags are different
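The first two items above can be sketched in markup. This is a minimal illustration, and the URL is a hypothetical placeholder, not a real page:

```html
<!-- In the <head> of a duplicate or near-duplicate page, point the
     search engines at the preferred version of the content: -->
<link rel="canonical" href="https://www.example.com/services/" />

<!-- Or, to keep a page out of the index entirely and tell crawlers
     not to follow its links: -->
<meta name="robots" content="noindex, nofollow" />
```

The canonical tag tells the engines which URL should receive the ranking credit, while the robots meta tag removes a page from consideration altogether; use one or the other depending on whether the duplicate page still needs to exist for visitors.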
Conclusion of Duplicate Content
I am not exactly sure how much has to be changed to make content unique enough not to matter, but apparently not much from what I can tell at this point. The proof is in the pudding. I can draw this conclusion because I have done and tested the things written about in this post. One site in particular still has solid results in the SERPs and carries plenty of duplicate content to this day. I believe it is safe to say that websites with high authority were not hit too badly even with duplicate content on them. Although pages with duplicate content from the same domain may have been penalized on a page-by-page basis, the overall domain was not impacted.
Were you affected by duplicate content, or were your outcomes the same?
Google Panda Release Dates
Here is a complete list of the Panda updates provided by Search Engine Land:
Panda Update 1.0, Feb. 24, 2011 (11.8% of queries; announced; English in US only)
Panda Update 2.0, April 11, 2011 (2% of queries; rolled out in English internationally)
Panda Update 2.1, May 10, 2011 (no change given; confirmed, not announced)
Panda Update 2.2, June 16, 2011 (no change given; confirmed, not announced)
Panda Update 2.3, July 23, 2011 (no change given; confirmed, not announced)
Panda Update 2.4, Aug. 12, 2011 (6-9% of queries in many non-English languages; announced)
Panda Update 2.5, Sept. 28, 2011 (no change given; confirmed, not announced)
Panda Update 3.0, Oct. 19, 2011 (about 2% of queries; belatedly confirmed)
Panda Update 3.1, Nov. 18, 2011 (less than 1% of queries; announced)
Panda Update 3.2, Jan. 18, 2012 (no change given; confirmed, not announced)
Panda Update 3.3, Feb. 27, 2012 (no change given; announced)
Panda Update 3.4, March 23, 2012 (about 1.6% of queries impacted; announced)
Panda Update 3.5, April 19, 2012 (no change given; belatedly revealed)
Panda Update 3.6, April 27, 2012 (no change given; first update within days of another)
Panda Update 3.7, June 9, 2012 (1% of queries; belatedly announced)
Panda Update 3.8, June 25, 2012 (about 1% of queries; announced)
Panda Update 3.9, July 24, 2012 (about 1% of queries; announced)
Panda Update 3.91, Aug. 20, 2012 (about 1% of queries; belatedly announced)
Panda Update 3.92, Sept. 18, 2012 (less than 0.7% of queries; announced)
Panda Update 20, Sept. 27, 2012 (2.4% of English queries impacted)
Panda Update 21, Nov. 5, 2012 (1.1% of English queries in US; 0.4% worldwide; confirmed, not announced)
Panda Update 22, Nov. 21, 2012 (0.8% of English queries were affected; confirmed, not announced)
Panda Update 23, Dec. 21, 2012 (1.3% of English queries were affected; confirmed, announced)
Panda Update 24, Jan. 22, 2013 (1.2% of English queries were affected; confirmed, announced)
Panda Update 25, 2013 (confirmed as coming; not confirmed as having happened)
Thank you for reading; all questions and comments are welcome. If this was helpful, pass it along, join our newsletter, or just share a little. I will try to update this list when the next release comes out from Google.
“Do not let the Panda eat all your website traffic.”
Search engine submission refers to the process through which a webmaster directly submits a website to the search engines. Most people look at this process as a way to increase their site's rankings. However, others do not take search engine submission seriously, for a few simple reasons. For one thing, the search engines run crawlers, bots, and spiders that will, at some point, go through each website on the Internet once your site has been submitted or linked to at least once. This process is known as being crawled, and it is not the same as being indexed.
Secondly, once the search engines know your site exists and links have begun to be established, it is no longer necessary to submit your site unless all of your inbound links are nofollow, which is highly unlikely. The crawlers and bots know you exist once they scan a followed link leading back to your site. Some search engines do require regular submission to remain in their databases, but most of these are irrelevant because the majority of your search traffic will come from the major search engines.
One of the primary reasons search engine submission is carried out is to register a new domain or to signal that you have changed the structure of your website significantly. This allows the search engines to capture all updates and additions made to the website. There is, however, one area of concern with search engine submission that is a topic of discussion in the SEO industry.
If you have a static website and you are not making changes to it on a regular basis, the need to submit your site to the search engines may be of greater concern. This is partly because a site that has remained dormant for long stretches can begin to drop from the results. Search engines like fresh content and rely on time stamps to recognize new content or changes to a website. This is one of the main reasons you should produce and update content at a regular pace: it keeps the crawlers and bots returning, and it removes the need for search engine submission because your site is no longer dormant in their eyes.
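When you do need to tell the engines about new or restructured pages, an XML sitemap is the standard mechanism. Here is a minimal sketch; the URLs and dates are placeholders, not real values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2013-01-22</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/new-page/</loc>
    <lastmod>2013-01-22</lastmod>
  </url>
</urlset>
```

You can point the crawlers at this file with a `Sitemap:` line in robots.txt or submit it through the engines' webmaster tools; the lastmod time stamps are one of the signals crawlers use to spot fresh content.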
Other Forms of Search Engine Optimization Services
Apart from search engine submission, other forms of search engine optimization should be implemented to increase site traffic and conversions. Some examples are creating quality content, directory submission, press releases, really simple syndication (RSS), and optimizing a website for maximum performance. If a site is properly optimized, it is possible to rank highly using low-competition keywords, sometimes referred to as generic keywords, because the competition for certain keywords is less significant than for others. You can find success by identifying the right niche for your industry and targeting these lower-priority keywords. Finding niches is effective because it lets you reach that small percentage of customers you are missing.
Research shows that webmasters who dedicate their time to applying search engine optimization tips see their pages gain more traffic than webmasters who do not utilize search engine optimization techniques at all. People who succeed at internet marketing use search engine optimization techniques and keep up to date with changes in the field of SEO.
There are two top methods for effective search engine optimization: on-page and off-page optimization. These are the methods that have been used all along and continue to be the de facto standard for search engine optimization. It is not just about building links; you must remain current with new SEO processes and stay dedicated to the dos and don'ts of SEO. There are also numerous tools available on the internet that can aid you in the search engine optimization process. Take advantage of these tools if you are trying to do it on your own.
Some of these tools analyze your site and indicate its position in the search engines, which can assist you in the process. Others offer keyword research help, site metric analysis, HTML suggestions, webmaster tools and guidelines, keyword density checks, bad link checkers, word counts on pages, and reports on heading, title, and description issues. Use all the available tools you can get your hands on, do the proper research, and you are bound to see gains.
“Code for all search engines the best you can.”