SEO in 2014: A Rewind of the Major Developments of the Last Year
It’s 2015 and there is a lot to work on now. If you are still recovering from the New Year celebrations, it’s time to start the year on a positive note. There was a lot of talk about how Google would change its approach to SEO with the arrival of contextual and semantic search. With mobile phone usage sharply on the rise, Google could not leave out searches from smartphones either, and it was only a matter of time before Google favoured websites with a mobile-friendly version.
What was SEO in 2014?
With 2014 behind us, let us take a look at how far SEO progressed over the year. If you have not been keeping tabs on SEO developments, Matt Cutts’s SEO announcements or Google’s wrath on press release websites, this is something you need to read.
It doesn’t matter whether you are curled up in bed on a snowy day or just sipping your morning coffee – let’s take a look back at what Google had to offer this year and how SEO has evolved.
Getting ready for 2015 – Looking at current SEO trends
2015 has just started and you need to be prepared for what’s out there. If you are new to doing business online or just starting to experiment with SEO, you need to know what the SEO world looks like today.
Here’s a look at what started to trend when it came to SEO this year.
- Brands are important
Okay, important things first. You need to create a brand out of your website. No, it doesn’t matter how many customers visit you at your Portland restaurant. You need to build a brand reputation online.
Yes, having more customers offline will help you get more reviews and comments online, helping you build your brand reputation. Google has increasingly given credence to brands and sites like Wikipedia and other trusted websites come out prominently in search results.
Google loves to give more credibility to posts with a known author name. In many ways, this has helped stop the practice of posting blogs without author names. That Google uses a sort of Author Rank was mentioned by none other than Matt Cutts on the Search Marketing Expo stage. Google’s use of Author Rank and its ranking of in-depth articles follow a simple philosophy – readers connect more with posts whose author they know.
Google moved away from its policy of displaying author profiles and authorship details in full, and you now see only a byline mentioned beside posts. However, authorship does continue to be important, and as a website owner you should have blogs posted by different authors. This helps build customer trust, reach a wider reader base and produce more varied content.
- The other use of social media
When we first talked about social media as an SEO tool, the myth was widespread that Google factors social media into SEO and that heavier social media use makes you rank higher. Google’s Matt Cutts again let us know the details, stating that Google didn’t favour any website for having a social media strategy.
That being said, social media is a great traffic generator. You might not generate extra SEO juice from Google but you would do something more – get more people to visit your site and that would help build brand recognition and in turn, build on your SEO efforts.
Social media is a great way to earn more referral traffic and disseminate content, and Facebook continues to be in the lead. According to research by Shareaholic, Facebook’s referral traffic is still the highest. If you are looking to build a reputation online, social media is the first place you need to look, and it needs to be included in your SEO strategy as well.
- Content marketing
More and more, SEO has come to mean lengthier and unique content. You need a content marketing plan when you are thinking about SEO today. It’s all about high-quality content, and we will talk more about this later in this post, when we cover the major Google updates of the year.
- The new form of guest blogging
Let’s put this at the outset: the traditional form of guest blogging is dead. This, after guest blogging reached its peak in 2013. The tactic was used for everything, from building links to adding credibility. Then Google took notice of how spammers used guest blogging to build brand authority; in January 2014 Matt Cutts stated in a video and blog post,
“Stick a fork in it: guest blogging is done.”
So, is guest blogging really dead? Matt Cutts explained later on that his comment meant Google aims to penalize those using ‘bad’ guest blogging.
So, what does bad guest blogging refer to?
From the SEO perspective, it refers to guest blogging just for the purpose of getting links without really contributing anything useful. On the other hand, if you blog on a few prominent sites and add content that’s useful and read by many, it’s not going to harm you. After all, every news website from Forbes to Huffington Post has a guest column. Here’s what Matt said after the first statement:
“There are still many good reasons to do some guest blogging (exposure, branding, increased reach, community, etc.). Those reasons existed way before Google and they’ll continue into the future.”
Guest blogging needs to remain in your SEO strategy today, as it is a quick and effective way of building your brand and connecting with a wider audience. Don’t just guest blog for any website out there; ensure that you have something useful to share and check the reputation of the site you are planning to guest post on.
- It’s more expensive than before
If there is anything that has become more expensive online over the years, it has to be SEO. The reason is simple – it’s harder to get inbound links and create content that’s share-worthy. Add in the increased competition for keywords and the far fewer keywords left in the low-competition range, and you will understand why the price has been on the rise. SEO providers will still tell you that they can rank you on the first page of Google, but if someone tells you they can put you right at the top, you might need to ask a little more – because it’s generally not that easy now!
- It’s important to know how to fill out the meta data
Do you read those two lines written below the link before you actually click the page? Whatever your answer, meta descriptions, although they do not directly improve search rankings, affect click-through rates considerably. Because of this seemingly unimportant status, meta descriptions have largely been neglected. But this is no longer the case. Studies show that optimized meta descriptions can actually help build SEO.
This can be explained easily.
Although the content of the meta description does not matter much in the ranking algorithm itself, user behaviour is factored into search results. The CTR (click-through rate) is a vital part of the algorithmic ranking process, and both Google and Bing seem to make use of this metric. A well-written meta description earns a page a better CTR in the search results, and that improved CTR feeds back into rankings.
From a strict algorithmic perspective, important keywords do not need to be included in the meta description. Google analyses user behaviour, along with other parameters, to rank a site – and since Google considers user behaviour, killer meta descriptions can help you achieve the best CTRs.
Improved tips for best Meta descriptions
Being descriptive: the meta description should introduce the main idea of the page the user is looking for. It should sketch the page’s content so that the person knows whether or not to click.
Being persuasive: involve a touch of persuasiveness to get more clicks! Descriptions that call for a response may prompt more people to visit the webpage.
Triggering curiosity: informational queries, as opposed to transactional queries, are among the best opportunities to spark curiosity among users. The meta description should outline the skeleton of the page yet keep a curiosity factor for readers and users.
Choice of words: keywords matter mainly to the users scanning the search results. Relevant words associated with the user’s query can make the difference between the SERP entry that gets a click and the SERP entry that gets overlooked.
Correct length: Google will truncate a meta description that is too long; around 156 characters is the standard length. Lastly, quotation marks should not be used in them, otherwise Google will cut the description off at the quote.
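The length rule above is easy to enforce mechanically. Here is a minimal sketch of a length check based on the ~156-character limit mentioned in the tips; the sample restaurant description is hypothetical, and the exact display limit may vary over time.

```python
# Trim a meta description at a word boundary so search engines
# won't truncate it mid-word. MAX_LENGTH reflects the ~156-character
# display limit discussed above; adjust if the limit changes.
MAX_LENGTH = 156

def trim_meta_description(text: str, max_length: int = MAX_LENGTH) -> str:
    """Return the description unchanged if it fits, else trim it
    at the last full word and append an ellipsis."""
    text = text.strip()
    if len(text) <= max_length:
        return text
    # Cut at the last full word that fits, leaving room for "...".
    cut = text.rfind(" ", 0, max_length - 3)
    if cut == -1:  # no space found: hard cut
        cut = max_length - 3
    return text[:cut] + "..."

description = ("Fresh, locally sourced seafood in downtown Portland. "
               "Browse our seasonal menu, book a table online, and read "
               "what our regulars say about the catch of the day.")
print(trim_meta_description(description))
```

A check like this is trivial to run over every page in a CMS before publishing, which is far cheaper than discovering truncated snippets in the live SERPs.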
No more of these please!
While trends indicate what’s happening in the SEO world, we have also seen age-old strategies become obsolete.
- Keywords are less important
Before the Google Panda update, ranking content was pretty easy: just spice up your content with the keywords you want to rank for, and Google would do the rest. It’s not that simple now. With semantic search and smarter crawling, Google understands the context of your content and whether it relates to your keywords when ranking it. In other words, the crap content you create with article spinners – which means nothing beyond its keywords and is aimed at just bringing you to the top of search results – will bring you nothing.
- No more Yahoo! Directory
This was the directory that started Yahoo!’s journey in 1994. This year it closed down, in fact five days earlier than Yahoo! had initially planned. Just log onto http://dir.yahoo.com and you will be redirected to https://business.yahoo.com/.
You cannot review any of the listings and this is what you will receive if you click on any of them.
Yahoo! seems to be coming out with its own Yellow Pages-style service soon, but the closing of the directory means that you will no longer be able to submit your websites to it as before.
- Focusing on just on-page optimization
Yes, those meta tags are still important. So are the title tags. But there is something more important that you need to know about – semantic search. Google now wants to know what you are writing about rather than just crediting you for web content stuffed with keywords. Content with no keywords at all won’t do you justice either, but don’t just write about tennis when your keywords are about basketball.
On-page optimization is important in the sense that it can guide Google on what your page is about, but there is a lot more to it than just that. You need content that’s relevant to your topic and to what you set out to write about. Something interesting and unique will be given more credence than something generic and common – Google knows what readers want.
- It’s no longer about short keywords
Instead, it’s more about long-tail keywords. Most short keyword phrases face high competition, and it is unlikely that you would get much juice from using them, given the time and money you would invest.
If statistics are to be believed, search volume for head terms is actually down by 8%, and long-tail SEO phrases are what’s needed.
- More pages do not mean better rankings
Remember those times when webmasters used to tell you that having more webpages was a must for better rankings? I remember Freewebs.com even charged customers on a per-page basis. Let’s put it simply: Google doesn’t discriminate by page count and looks at quality over quantity. Don’t add too much content; instead focus on having content that’s actually good. Low-quality content will drag you down quickly and can hurt you badly. It’s good to take one step at a time; after all, you don’t want to stumble. If you want to get things in perspective, consider the example of Wise Geek.
They were once a frontrunner solely because of their large volume of content pages, as were Hubpages and Squidoo – and all of them faced major penalization from Google in 2013 and 2014, severe enough to make them change their entire working strategies. Neither Hubpages nor Squidoo is a personal favourite of mine, especially the latter, because they dictate what my content needs even when I know it doesn’t. But I am sure they will learn soon – I do not always need a poll, and on Squidoo you can never tell when content counts as high quality for their automated editors and when it doesn’t. But that’s something different. You do need rich media in your content, in every post, unless you have something really short and sweet to tell, like a page dedicated to news headlines.
The Google Algorithm Updates
Apart from the trends and the things that have become obsolete, let’s talk about the major Google updates this year. Post-Hummingbird, the SEO world has changed from what we once knew it to be. Hummingbird made drastic changes to how Google perceives websites and reshaped the web landscape. Businesses had to go back to the drawing board and change everything, including all marketing strategies and SEO efforts, often needing to start afresh altogether. Hummingbird happened in 2013, but some of its effects were visible in 2014 as well.
While plagiarism received a strict no, there is another area Hummingbird forced website owners to revisit – how to improve the customer experience. The obvious answers are to post natural content and not to force in keywords. Experts will tell you that keywords need to read naturally within the content, something that has come about because of contextual and semantic search. Website owners now focus more on having the right title tags to avoid bounces and on using page URLs properly.
For instance, your webpage doesn’t need to be named xyz.com/12345. If you have a page selling basketballs, make the URL www.xyz.com/sell-basketball. It helps with everything from getting SEO juice, to Google favouring you, to readers knowing what you really have to offer. Simple and quite effective. Hummingbird was the next logical step in Google’s evolution, and 2014 saw the next stage of that evolution.
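Turning a page title into that kind of descriptive URL is a one-liner in most languages. Here is a small sketch; the domain and page title are made-up examples.

```python
import re

def slugify(title: str) -> str:
    """Lowercase the title, replace runs of non-alphanumeric
    characters with hyphens, and strip stray hyphens at the ends."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics -> hyphen
    return slug.strip("-")

print("https://www.xyz.com/" + slugify("Sell Basketball"))
# -> https://www.xyz.com/sell-basketball
```

Most CMSs do this automatically, but it is worth checking that yours does not fall back to numeric IDs for new pages.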
You have probably heard about the Panda updates. Introduced in February 2011, the Google Panda update was basically introduced as a better search filter. The updates have been aimed at preventing sites with poor-quality content from appearing among the top search results on Google. To explain it simply, it is a series of ongoing algorithm updates that refine search and improve the value of query results for users.
What sites does it affect?
The search engine optimization (SEO) industry, along with organizations and web developers, followed the Panda updates carefully, as the Panda changes significantly impact the amount of website traffic from natural or organic search results. Local business sites may find the Panda updates favourable, especially those companies that are truly working for quality.
In May 2014, Panda 4.0 was released, targeting spam queries and affecting more than 7.5% of English queries. Panda 4.1, the next large update, rolled out in October 2014, and that seems to have been the last one of the year. There were plenty of tweets about Panda being on holiday this time around, but the truth is Google only divulges the main algorithm updates. Moreover, most of the time we learn about updates through their effect on other web owners’ site statistics, so minor changes are not so easy to detect.
So, what are the statistics?
Here is a look at some of the statistics you might want to know about when it comes to Panda statistics in 2014.
- As per user survey details from Polldaddy, 10.6% (44 votes) said they had recovered from previous Panda penalties. 17.35% (72 votes) said their rankings increased and they had not been hurt by Panda. 27.23% (113 votes) said their rankings decreased even though they had never previously been hurt by Panda. 25.3% (105 votes) kept the same rank without previously being hurt by Panda. Only 15.18% (63 votes) said they did not recover from a previous Panda penalty.
- As per the Panda 4.1 poll, it looks like there was almost a 30% drop in traffic from Google organic search.
So, what do you do if your site is affected by any of the Panda updates? Well, irrespective of the type of Panda update, you can ensure that your site recovers from it gradually, no matter how deeply you are hit.
You need to weed out scraped and copied content to keep your website from getting low ranks in search engines, as low-value content can drag down your entire site in spite of your unique and valuable content. Also, an article will look more trustworthy if it links to authoritative sources, just as adding an author will help you gain more authority.
A healthy ratio of advertisements is equally important to keep rankings high. However, being a Panda update victim does not necessarily mean you need to delete your older blogs and articles.
Track and improve your social branding, along with your link building and content marketing efforts. Close study of the Panda updates will automatically help improve your SEO. Moz Analytics, Open Site Explorer and Followerwonk can help you improve your website.
It’s extremely important to improve the visibility of your website, which can eventually increase website traffic. For this, you need to be aware of the resources to broaden your SEO, know the local search ranking factors, and encourage and manage online reviews. Needless to say, your social media strategy should remain comprehensive. Also, you should stay updated on the latest research from experts in the online marketing field.
How accurate is Google when carrying out searches? This is the area Google Pigeon set out to address, improving Google’s local search results with the latest Pigeon update. The aim was simple – if you were searching for a restaurant nearby, you should be shown only the handful of results you would find most relevant. In effect, it also meant that all the other restaurants would lose out.
One of the major changes with the 2014 algorithm update is that the Google Local 7-Pack (the group of seven Google Maps listings) went through certain facelifts. Though the effects of the algorithm were noticed only in the US, the effects on the 7-Pack were noticed quickly. Organic results went through major changes because of this, and many directories achieved better visibility.
How do you know if you have been affected?
The Google Pigeon algorithm affects local site rankings. It takes into consideration the different ranking signals that determine local SERPs, even in non-local searches. Many website owners found that listings for keywords that earlier appeared in SERPs could no longer be seen. Realtors and property websites have been affected a lot, and big local directories have been affected as well. It has also affected the visibility of brands in local search engine listings.
Update in 2014:
The update has rolled out to the US, UK, Canada and Australia. Since Google has not released any official statement about the algorithm update, it is difficult to conclude who will suffer and who will not. If you live outside these countries, it might still be a little while before Google comes your way, but you need to be prepared for what’s coming. The problems with Google implementing the changes in every country can be many, including logistical difficulties, and it can take years before it’s done – just as Google Maps is still a work in progress. However, there are a number of precautions one may take to be affected less.
What do you need to avoid?
Over-optimization of websites should be avoided, and local listings should be attached to strong domains only. Location also matters with the changes in the algorithm. Creating a local Google+ page is important, and the area code of your phone number should match the zip code of the city you are based in, with the same details registered on directories. If there are differences in address, zip code or area code, rankings will drop drastically. It’s important to maintain consistency.
What you need to do as a website owner?
As a website owner, if you have noticed a drop in your ranking, it may be due to the invisibility of certain local listings; to cope with the loss, use a PPC campaign and focus on getting web search listings for those keywords. Local rankings now depend on traditional SEO factors like domain authority and backlinks. If your site’s general SEO characteristics lag, thorough competition research will be required.
Moreover, find the most influential directories in your niche, like Yelp, and make sure your business is listed in them. A local website that is well optimized and compatible with smartphones and other gadgets has a better chance of getting good rankings. Medium-sized businesses should build authoritative links for better rankings as well.
Announced on April 24, 2012, Google Penguin is a series of algorithm updates aimed at providing better search results by targeting those who use spam methods to get backlinks. The Google Penguin updates identify websites that violate Google’s Webmaster Guidelines.
What sites does it affect?
As Google Penguin is not niche-specific, the Penguin updates are designed to find all offending websites and downgrade their search rankings. With too many links from low-quality pages, your website’s rank can drop drastically.
Updates in 2014
In 2014, Google Penguin 3.0 rolled out on October 17. This is the fifth update to the initial Penguin algorithm. After a lot of fluctuations over a few weeks, the update died down a bit. However, massive changes were seen on Thanksgiving Day, when Google actually confirmed they were part of 3.0.
After Penguin 3.0 on October 17, 2014, Penguin 3.1 was released on November 27, 2014, Penguin 3.2 on December 2, 2014, Penguin 3.3 quickly after on December 5, 2014, and Penguin 3.4 on December 6, 2014. Of course, with all these updates, website owners saw wide spikes in website hits that were then reversed every now and then.
The frequency of the updates has many asking what exactly Google wants from them. Since the latest Penguin update started rolling out on October 17th, it has influenced less than 1% of English queries, though it might have had a smaller or greater impact in other languages. As Google confirmed on December 2nd, the changes that were part of the Penguin 3.0 rollout are still “ongoing”.
Things to avoid
After fully understanding Penguin’s targets and your site’s position, there are a few things that need your attention. To avoid a Penguin penalty, website administrators should start by conducting a complete link audit of the website.
The link profile audit should ensure that all links from guest blogging networks are removed. Exact-match anchor links should also be removed to avoid the penalty, as should over-optimized anchor links. You should request guest post sites to no-follow the links back to your site. And last but not least, links from spam sites should be removed immediately.
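One step of such an audit can be sketched in a few lines: flagging exact-match anchor text in a backlink profile. The backlink records and the target keyword below are hypothetical; in a real audit this data would come from a tool like Open Site Explorer or Webmaster Tools exports.

```python
# Flag backlinks whose anchor text exactly matches the money keyword,
# which is one of the signals the audit above says to clean up.
target_keyword = "portland seafood restaurant"

backlinks = [
    {"source": "foodblog.example.com",  "anchor": "portland seafood restaurant"},
    {"source": "cityguide.example.com", "anchor": "this great spot downtown"},
    {"source": "spamdir.example.net",   "anchor": "Portland Seafood Restaurant"},
]

exact_match = [b for b in backlinks
               if b["anchor"].lower() == target_keyword]
ratio = len(exact_match) / len(backlinks)

print(f"{len(exact_match)} of {len(backlinks)} links are exact-match "
      f"({ratio:.0%}) - candidates for removal or no-follow requests")
```

The right threshold for a “dangerous” exact-match ratio is a judgment call; the point of the script is simply to surface the links worth reviewing by hand.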
What you need to do as a website owner?
As a website owner, you need to know if you’re going wrong somewhere. Applying white hat strategies is an absolute must, or Google will find out sooner or later. Unfortunately, when Google does find out about your tactics, you will be penalized, heavily.
There are comprehensive social media guides available that can help you build your social presence and increase the visibility of your website. Link building, an added SEO skill, is another essential thing to know as a website owner.
Getting yourself ready with your website
Be ready with an SEO blueprint, SEO cheat sheets, keyword targeting and on-page optimization to enjoy the best rankings. With Moz Analytics, Open Site Explorer, Followerwonk and the like, you can improve your social, branding, link building and content marketing. You need a robust toolset for SEO, social media analytics and rank checking to make your business successful.
Moreover, Google is now experimenting with new ranking factors. In a post in November 2014, Google stated that it was looking at different mobile ranking factors.
Yes, Google has a mobile ranking algorithm that ranks websites higher if they are mobile-friendly. But you probably knew about that. If you have a site that’s not optimized for mobile viewing, 2014 started a trend that’s not going to end – your site rankings will be affected.
If you are still looking for the exact criteria Google uses to gauge your website’s mobile-friendliness, do not forget to check out Google’s new mobile-friendly test tool. It gives a comprehensive view of everything they are looking for, including UX features. Also, keep an eye on the mobile usability report within Google Webmaster Tools, which will help you know the areas you need to improve in order to get a higher ranking.
Here is a small roundup of features Google might look at on your site.
- Whether your website uses Flash (which most mobile devices cannot display).
- How optimized is your site across various devices?
- Do your readers have to zoom in to read the content all the time? How well does the content score on readability?
- Do you have too many links within your site?
- How good is the user experience? This encompasses everything from easy to touch elements to other buttons and visual designs.
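A couple of the signals above can even be smoke-tested before running Google’s official tool. The toy sketch below checks a page for a viewport meta tag (so readers don’t have to zoom) and for Flash embeds; the HTML snippet is a made-up example, and Google’s real mobile-friendly test looks at far more than this.

```python
# Two rough mobile-friendliness checks over a page's raw HTML:
# presence of a viewport meta tag, and absence of Flash embeds.
sample_html = """
<html>
  <head>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body><p>Welcome to our mobile-friendly page.</p></body>
</html>
"""

def quick_mobile_checks(html: str) -> dict:
    """Return a dict of naive string-match checks on the HTML."""
    html_lower = html.lower()
    return {
        "has_viewport": 'name="viewport"' in html_lower,
        # <object> embeds and .swf files are how Flash content ships.
        "uses_flash": ".swf" in html_lower or "<object" in html_lower,
    }

print(quick_mobile_checks(sample_html))
# -> {'has_viewport': True, 'uses_flash': False}
```

String matching is obviously crude next to a real HTML parser, but it is enough to flag pages worth feeding into the official mobile-friendly test.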
Surging Ahead in 2015
In spite of the Google updates, we still have marketers unable to comprehend much about how to rank higher in local search. Studies and reports indicate that the right SEO strategy can increase engagement rates exponentially.
Source: Local Search Association ad performance insights database (2014)
A new report (registration required), sponsored by SIM Partners and written by Forrester Consulting, illustrates the confusion faced by marketers.
Source: SIM Partners, Forrester (December 2014)
A problem with the study is its sample size – it’s too small. However, having looked at previous reports, we can say it fairly represents the general outlook on the matter.
Here are the details of another study, conducted in 2013 by the CMO Council and sponsored by Balihoo.
Source: Balihoo, CMO Council (February 2013)
About other search engines
The Bing Dilemma
Google is ubiquitous when it comes to online search. However, if you were handed responsibility for Bing’s success and needed to make a mark on its market share, there are three basic strategies to follow –
- out-feature the competition
- develop the brand
- go on the attack
To differentiate from the existing search engine, you need to out-feature Google by adding exciting new features to the search results. According to one study, most searchers – almost 62% of them – do not like social media results within their search results. Even so, adding Facebook to its search engine results page was one of the tactics Bing tried as a solution.
- Develop the Brand
Product placement and traditional media advertising can help make the Bing brand top of mind. For instance, after appearing in popular TV shows and in movies like Spiderman and Source Code, Bing has made its placements quite prominent. These placements do not come free, especially in blockbuster movies like the ones above, and can run into millions of dollars.
- Go on the Attack
If both of the above steps fail, the last resort is to go on the attack. Bing has been pursuing this strategy, known as the “Scroogled” campaign, over the last couple of years. It was originally focused on Google’s shift to a pay-for-play shopping search engine. Over the last year, Microsoft has become more aggressive, with new campaigns targeting tablets, mobile and even the Chromebook.
As comScore reported, search engine market shares have been essentially flat since 2010. Other market share data sources showed similarly flat results for Bing, and online researchers still prefer to “Google” rather than “Bing” when searching. Thus, by all accounts, the above strategies did not work.
So the obvious question arises: what would be the next step if the building, branding and attacking approaches fail this drastically? Also, just to add to the factual list, Microsoft’s online division has lost more than $10 billion since 2005.
Let’s look at the three basic options:
- Keep up the fight
As Forrester Research shows, the search industry will be worth more than $33 billion by 2016 – almost half of Microsoft’s entire annual revenue. Known as the “fight the fight” approach, this option dictates that Bing continue to plough forward and perhaps eventually succeed in seizing market share from Google.
- The skeleton approach
In this approach, Microsoft reduces the search engine’s maintenance expenses to the lowest possible level and does not devote major dollars to new customer acquisition or research and development. Microsoft essentially gives up on new customer acquisition, barring a catastrophic stumble by Google.
- Giving up
With the new CEO at Microsoft, there might be some drastic changes coming to Bing in the near future. With the online division still losing billions of dollars a year, Bing might try new tactics to grab market share.
SEO is one of the most important and fastest-evolving technical arts, constantly changing its dimensions to improve the searcher experience. Let’s look at what industry leaders predicted for SEO in 2020.
Eric Enge: although there will be strides in understanding new website and app experiences, the basic elements of SEO will remain the same. People will still look for answers and even for serendipitous experiences, and by 2020 search engines will need statistical probability analysis to satisfy both of those needs. The real part of the job will lie in making it easier for search engines to understand content and deliver a great user experience.
Ann Smarty: in common SEO terms, there has not been any drastic change since 2008. But by 2020, “free search traffic” will be a thing of the past, and it might be impossible to predict search rankings. Internet marketing companies need to embrace PPC management if they haven’t already. Proactive link building might become too risky, even when carried out with perfectly white-hat methods. Creative app promotion tactics should be the next look-out for companies.
Aleyda Solis: both semantic and conversational search functionalities and capabilities might change vastly over the next few years. New technical abilities, strategies and tools will help grow and maximize organic search visibility and connect with existing as well as potential customers. The “search ecosystem” of 2020 should be built around these goals.
Marcus Tandler: search will be more local and a lot more personal by 2020. And Google will be almost omnipresent, with its effort to predict your choices, questions and interests even before you think about them. Although websites will still be there, by 2020 the organic search index will try to give more direct and correct answers.
Rand Fishkin: it’s probably impossible to predict the web of 2020. Maybe the blending of media, entertainment and advertising will lead to higher expectations of content.
Purna Virji: keyword data will practically become extinct with the influx of advanced devices and personalized platforms. User experience and intent will be the centre of attention for SEO teams, which will have to address different customer segments to keep users engaged. Cross-functional SEO will be in practice.
Mike King: Google search will get all the more specific, and for brands it will be necessary to build content and social connections.
Casie Gillette: thinking beyond traditional search engines like Google or Bing, companies need to understand their audience. Technical audits, writing content and building links will still be there, but in a different way.
Dan Petrovic: by 2020 SEO will be a complex discipline with several sub-disciplines. Conversational search, the Internet of Things (IoT), Google as a personal assistant, and predictive search and answers should be the core ideas to pay attention to.
Stephan Spencer: traditional ideas like “link building” might change and become more sophisticated. With search engines advancing at an exponential rate, it’s no wonder they will be powered by artificial intelligence.
The arrival of the Google Translate app
You must have heard the popular quip about the Google Translate online service: why learn a language when you have access to probably the largest translation service globally? The best part is that this service also fights spammers who auto-translate content. The translation app was available on Google Play and has also been added to the Chrome Web Store.
What is the Google Translated Content?
One of the major hurdles for a website is the cost of writing. Long-tail keywords and subjects might simply not pay off for the site. Thus, more and more websites turned to automatic translation, in most cases using Google Translate, to render their original content in as many languages as possible. One strange outcome was that Yiddish became very popular, despite being spoken mainly by the ultra-orthodox Jewish community.
Sites with auto translated content are penalized by Google
Automatically generated content is considered spam. In fact, some sites that monetize their content using Google AdSense have received emails stating that their AdSense account has been disabled because of gibberish, auto-generated content.
Auto-translated but non-spam content
Google has an internal translation tool, superior to the public online tool, that takes care of the basic errors made by automatic translation.
Identifying names: the names of games, people or cities are mistakenly translated by Google Translate into their equivalent terms. Imagine how strange it would sound if the brand “Apple” were translated into “Manzana”, which is Spanish for the apple fruit!
Confusing masculine and feminine: the public Google Translate has come up with sentences like “she is a great game” when translating a Hebrew conversation; Google’s internal translation tool, however, renders it as “it is a good game”.
Spelling mistakes: there are instances where the public tool makes numerous mistakes in a translation, while the internal tool makes far fewer.
Fluency: the internal tool makes much better decisions than the public one; compared side by side, the internal tool’s translations are far more readable.
Obviously the question arises: why does Google hesitate to release its best translation technology despite having it? The most likely answer is that Google is afraid of mass spamming that would otherwise become very difficult to identify.