Showing posts with label crawling.

Saturday, 17 October 2015

Seven Ways You Might Be Losing Out On Search Rankings

SEO is increasingly aligned with user experience (UX) and content value, as Google sharpens its algorithms to retain its primacy as a search tool. Expect Google to penalize you more often in the form of less traffic, and expect users to penalize you in the form of more bounce for the same issues: weak content, unfriendly design, or attempts to game Google and feed people ads they don’t want. If you feel like you’re doing everything right but you’re still not ranking for the searches you’re targeting, one of these may be the problem.

1. Your Website Isn’t Optimized for Keyword Search

That’s optimized, not maximized. You want the ideal keywords used the right number of times, not the biggest possible collection of keywords. Stuff your website with keywords and Google will penalize you for it; leave them out and you neglect one of the simplest ways to increase organic traffic.

Diagnosis:

One way you can end up in a keyword-free environment is if your headers and other key text are embedded in graphics, because image text isn’t crawled. Aside from that, it’s a pure copywriting issue and a clear sign that you need a new copywriter: keyword optimization should come as standard.

Treatment:

In order of importance, your target keywords should appear in:

  1. Page title
  2. H1 and H2 header tags
  3. Content
  4. Meta description

A Word About This: Google doesn’t care that much about keywords anymore. Rather, it cares about key meanings.

Let me elaborate. According to Jayson DeMers, "It doesn’t matter that you used the phrase 'auto repair shop' exactly several times throughout your website. You could use 'auto repair shop,' 'car repair specialists,' and 'vehicle repair facility' on different pages, and Google could theoretically put you in the exact same category."

It’s the meaning that’s getting crawled. However, the higher up the table of importance you go, the more sense it makes to shoot for specific keywords. 

Best advice? Use your best keyword once in the title and once in the meta description. Consider using it or a close variant in H1 and H2, and use synonyms in the body content. That way, you’re covering all three sections: fat head, chunky middle, and long tail.
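To make the placement checklist concrete, here is a minimal sketch, using only Python’s standard library, that audits where a target keyword appears on a page. The priority buckets mirror the list above; the HTML and keyword are illustrative placeholders, and this is a rough diagnostic, not an official Google weighting.

```python
# Minimal sketch: report which high-value page locations contain a
# target keyword (title, H1/H2 headers, body, meta description).
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self._stack = []       # open tags, so we know where text lives
        self.found_in = set()  # locations where the keyword appeared

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)
        # The meta description lives in an attribute, not in text content.
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description" and \
               self.keyword in d.get("content", "").lower():
                self.found_in.add("meta description")

    def handle_endtag(self, tag):
        if tag in self._stack:
            self._stack.remove(tag)

    def handle_data(self, data):
        if self.keyword not in data.lower():
            return
        if "title" in self._stack:
            self.found_in.add("title")
        elif "h1" in self._stack or "h2" in self._stack:
            self.found_in.add("headers")
        else:
            self.found_in.add("body")

# Illustrative page, not a real site.
page = """<html><head><title>Auto Repair Shop | Springfield</title>
<meta name="description" content="Your local auto repair shop.">
</head><body><h1>Car repair specialists</h1>
<p>Our auto repair shop fixes everything.</p></body></html>"""

audit = KeywordAudit("auto repair shop")
audit.feed(page)
print(sorted(audit.found_in))  # → ['body', 'meta description', 'title']
```

Note how the H1 in the sample uses a synonym ("car repair specialists") rather than the exact phrase, so "headers" is absent from the report, which matches the advice above about synonyms in lower-priority slots.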

2. You’re Repelling Spiders

When Google doesn’t crawl your site frequently, you slide down the search rankings. And if you leave Google to its own devices, it might not crawl you for weeks, which means your newly optimized site isn’t getting any more action because Google hasn’t noticed the changes yet.

Diagnosis:

Google uses the data that spiders report to rank pages in search. Not being crawled means search rank doesn’t get updated. Additionally, being crawled infrequently contributes to poor search rank because Google notices that you don’t update your site very often.

Treatment:

You want to entice the spider bots to crawl your site as often as possible. You can find out how often they already do it under Crawl Stats in your Google Search Console. 

Here’s how to get spiders to your site:

  1. Check server function. Slow load times and unreliable servers incur SEO penalties and discourage frequent crawling.
  2. Update your site frequently. This is one function of your blog. It's also a reason why your blog should be under yoursite.com/blog, not blog.yoursite.com. Google applies the SEO benefits of your blog to your entire site. You should also frequently update site copy if it’s appropriate.
  3. Get more inbound links. Beware, though. Quality counts more than quantity.
  4. Ask Google to crawl your site. Use Fetch as Google in Search Console’s Crawl menu. Put the URL to any of your pages in the box and Google will crawl it.
  5. Keep your sitemap updated and error-free.
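On that last point, a sitemap is just an XML file listing your URLs and when they last changed, which gives crawlers an explicit reason to come back. Here is a minimal sketch using Python’s standard library; the URLs and dates are placeholders, and real sitemaps can also carry optional hints like changefreq and priority.

```python
# Minimal sketch: generate a sitemap.xml per the sitemaps.org protocol,
# with a <lastmod> date for each URL so crawlers can see what changed.
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: list of (url, last_modified_date) tuples."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs and dates for illustration.
sitemap_xml = build_sitemap([
    ("https://yoursite.com/", date(2015, 10, 17)),
    ("https://yoursite.com/blog/", date(2015, 10, 17)),
])
print(sitemap_xml)
```

Regenerating this file whenever content changes (and resubmitting it in Search Console) keeps the "updated and error-free" box ticked.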

3. Pandas Are Devouring Your Content

Pandas are harmless in the wild. Online, they’re feared.
Websites lose serious amounts of organic traffic each time Google brings out a new Panda update. If your content is poorly written, too short, uninformative, or duplicated, Panda will chew you up, and it affects not just the low-quality pages but the entire site.

Diagnosis:

You won’t be notified about algorithmic penalties. But if site-wide traffic falls around the time a Panda update is announced, Panda might be the reason.

Treatment:

Panda hunts weak content, so the first thing to do is go over yours. Clear weak blog posts and poorly written copy from your website. After that, all you can do is wait for the next refresh, as Panda rolls out at a very slow pace.

My suggestion is to preempt Panda. Rand Fishkin says, "If you can't consistently say, 'We're the best result that a searcher could find in the search results,' well, then guess what? You're not going to have an opportunity to rank." 

Aim to have all your content as good as the top-ranking pages for your target searches. That’s the floor, not the ceiling.

4. Your Links Are Attracting Penguins

Google has a habit of naming hunter-killer algorithms that threaten large percentages of your traffic after cuddly animals. Where Panda hunts weak content, Penguin attacks unnatural link profiles.

When you have lots of spammy or unnatural links, it’s Penguin that will penalize your site. Poor link quality, or a sudden spike in new links followed by a sudden lull, will draw Penguin’s attention. So will having too many links from the same source or with identical anchor text.

Links are still useful; relevance and quality are what give them authority. As Google’s Matt Cutts put it, links are "the best way we’ve found to discover how relevant or important someone is."

Diagnosis:

As with Panda, you won’t get an email from Google; you'll just get a whole lot less traffic. Penguin hits specific pages. If traffic to certain pages suddenly drops by more than half, it’s likely Penguin.

Treatment:

Clean up your links - and then wait. If you’re hit by Penguin, you have to wait until it comes around again to recover. Worse, the problem could be one of several things or a combination, so work on building a natural link profile now by avoiding the temptation to build links artificially. Google’s John Mueller recently cautioned that link building is best done naturally, by making your content easy to link to.

5. Your User Experience Is Top Heavy

Top Heavy is another Google algorithm that targets certain website configurations. Google has moved into targeting websites that offer poor user experience (UX), as well as spammy text or blacker-than-black-hat link profiles. The Top Heavy algorithm targets websites that keep their content under a huge array of banners, ads, and other non-user-oriented material. If your site requires users to scroll past ads or if you don’t have much content "above the fold," Google thinks, "that's not a very good user experience" and penalizes you accordingly.

Diagnosis:

How do you know? Probably only by correlating traffic drops with Top Heavy rollouts, which are infrequent, happening only once every couple of years. This is a site-wide penalty, so traffic to all pages will drop simultaneously if Top Heavy is the culprit.

Treatment:

Basic UX rules should keep you safe. Design a decent user experience, and you won’t even feel the tailwind from Top Heavy.

6. You’re Immobile

Google gets half of its traffic, as well as half the views on YouTube, from mobile. If your site isn’t good with mobile, it’s not good with Google. People want mobile sites and it’s in any site’s best interest to be mobile-friendly. Just like other algorithm penalties, this isn’t about pushing things in a certain direction; it is about reflecting user experience in search results.

Diagnosis:

To see if you’ve been penalized for poor mobile performance, you can use the Mobile Friendly Test tool. However, you should already know if your site is mobile-friendly and if it is not, mobile algorithm updates are the least of your worries. You’ll lose users when your site loads slowly, looks bad, and doesn’t work on their mobile devices.
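One of the simplest signals a mobile-friendly page carries is a responsive viewport declaration. The quick check below, using only Python’s standard library, looks for that single signal; it is only one of the things Google’s Mobile Friendly Test evaluates, not a substitute for running the tool.

```python
# Minimal sketch: check whether a page declares a responsive viewport,
# i.e. a <meta name="viewport"> tag whose content includes
# width=device-width. One small mobile-readiness signal, not a full test.
import re

def has_responsive_viewport(html):
    m = re.search(
        r'<meta[^>]+name=["\']viewport["\'][^>]+content=["\']([^"\']+)',
        html, re.IGNORECASE)
    return bool(m and "width=device-width" in m.group(1))

print(has_responsive_viewport(
    '<head><meta name="viewport" '
    'content="width=device-width, initial-scale=1"></head>'))  # → True
print(has_responsive_viewport('<head><title>No viewport</title></head>'))  # → False
```

A page that fails even this check will almost certainly render as a shrunken desktop layout on phones, which is exactly the experience that loses users.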

Treatment:

Consider a mobile-first design, especially for landing pages. Mobile accounts for just under half of organic search traffic, and that share is growing fast. A mobile-first website can look great on a desktop, but the other way around doesn’t work as well.

7. Googlers Don’t Like You

Poor-quality, thin content doesn’t just repel users and attract Pandas. It also attracts Google staff, who will penalize you manually. Thin content is defined as:
  • Repetitive or spun content that provides little value to the user
  • Artificially created content
  • Low-quality guest posts 
  • Scraped articles 

Diagnosis:

There’s no need for third-party tools or tactics. The Google Search Console will just tell you if you’ve been hit with one of these. Under Manual Actions in Search Traffic, you’ll see a notification alerting you that your site has "thin content with little or no added value." Site-wide matches mean your whole website is being penalized, while partial matches mean only certain pages are affected.

Treatment:

Improve your content. Because it’s a manual penalty, you won’t have to wait until the algorithm updates. However, you will need to radically improve site content.

In Conclusion

If you’re not ranking for the searches you’re targeting, maybe you need an SEO overhaul. Perhaps the problem is design or links, or you might need to look at copy and content. 

Whichever approach is needed - and it may be more than one - the best way to get good results is to build with a user-first approach, emphasizing quality content and quality links. Make the user experience a priority. That way Google won’t penalize you, and you’ll reap the benefits of higher organic search rank and lower bounce.

About the author

  • Pratik Dholakiya, Co-Founder & VP of Marketing, E2M Solutions

Saturday, 23 May 2015

Rolling out the red carpet for app owners in Search Console

Wouldn’t it be nifty if you could track where your indexed app content shows up in search results, for which queries, which app pages are most popular, and which ones have errors? Yeah, we thought so too! So we’ve equipped our freshly renamed Search Console with new reports to show you how Google understands and treats your app content in search results.

Our goal is to make Search Console a comprehensive source of information for everyone who cares about search, regardless of the format of their content. So, if you own or develop an app, Search Console is your new go-to place for search stats.

Add your app to Search Console
Simply open Search Console and enter your app name: android-app://com.example. Of course, we’ll only show data to authorized app owners, so you need to use your Google Play account to let Search Console know you have access to the app. If you don’t have access to your app in Google Play, ask an owner to verify the app in Search Console and then add you.

Connect your site to your app
Associating your site with your app is necessary for App Indexing to work. Plus, it helps with understanding and ranking the app content better.

Track your app content’s performance in search
The new Search Analytics report provides detailed information on top queries, top app pages, and traffic by country. It also has a comprehensive set of filters, allowing you to narrow down to a specific query type or region, or sort by clicks, impressions, CTR, and positions.

Use the Search Analytics report to compare which app content you consider most important with the content that actually shows up in search and gets the most clicks. If they match, you’re on the right track! Your users are finding and liking what you want them to see. If there’s little overlap, you may need to restructure your navigation, or make the most important content easier to find. Also worth checking in this case: have you provided deep links to all the app content you want your users to find?

Make sure Google understands your app content
If we encounter errors while indexing your app content, we won’t be able to show deep links for those app pages in search results. The Crawl Errors report will show you the type and number of errors we’ve detected.

See your app content the way Google sees it
We’ve created an alpha version of the Fetch as Google tool for apps to help you check if an app URI works and see how Google renders it. It can also be useful for comparing the app content with the webpage content to debug errors such as content mismatch. In many cases, the mismatch errors are caused by blocked resources within the app or by pop-ups asking users to sign in or register. Now you can see and resolve these issues.

To get started on optimizing and troubleshooting your own app, add it to Search Console now. If you want to know more about App Indexing, read about it on our Developer Site. And, as always, you’re welcome to drop by the help forum with more questions.

Posted by:
Hillel Maoz, Engineering Lead, Search Console Team (favorite app: Flipboard) and
Mariya Moeva, Webmaster Trends Analyst (favorite app: Spotify)

Original Article: http://googlewebmastercentral.blogspot.in/2015/05/rolling-out-red-carpet-for-app-owners.html?m=1