5 common SEO problems with Nigerian Businesses and how to fix them

April 18, 2017 · 9 min read

SEO problems are a common headache for every CEO, and Nigerian business owners are not left out. Many Nigerian business owners have long suspected that their businesses have search engine optimization problems, but don’t know exactly what those problems are.

So, instead of leaving you to the random search engine optimization tips bloggers preach for traffic’s sake, or to self-styled SEO ‘gurus’ promising to solve your SEO problems with magical techniques, I chose to show you 5 common SEO problems with Nigerian businesses and teach you how to find and fix them.

What is SEO (Search Engine Optimization)?

[Image: SEO keyword. Photo Credit: therichbrooks via Compfight cc]

Search Engine Optimization involves on-page optimization, where you fine-tune your website’s HTML snippets so search engines can understand which web pages and content on your website matter most, and off-page optimization, where you create, tweak and promote content that other websites, social media influencers and your industry mates can link to.
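For context, these are the kind of HTML snippets on-page optimization fine-tunes. The values below are placeholders, not a real page:

```html
<head>
  <!-- The title search engines show as the clickable headline -->
  <title>Affordable Web Design in Lagos | Example Company</title>
  <!-- The summary search engines may show under that headline -->
  <meta name="description" content="Example Company builds fast, affordable websites for Nigerian businesses.">
</head>
```

Get those two snippets right on every important page and you have already done a chunk of your on-page homework.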

SEO is the organic part of Search Engine Marketing. On-page and off-page optimization are the two ‘shidwen’ (children) of their daddy, Search Engine Optimization.


You must do both to achieve significant success in organic search ranking.

SEO is organic; you don’t raise a budget to outsmart competitors the way you do in PPC. Rather, you have to be technically sound enough to troubleshoot your SEO problems and communicate appropriately with your developers.

Also, you have to be creative enough to strategise effectively as you reach out to others or expect them to bless you with social shares, blog comments, backlinks and so on.

What are common SEO problems with Nigerian Businesses?

I am not offering you some random search engine optimization tips here. I will show you 5 of the most common SEO problems that Nigerian websites have.

You will identify the most common SEO problems with Nigerian businesses by yourself, confirm whether your website has any of them, and then tackle them by yourself.


I’m hoping you can!

Number 1 SEO problem with Nigerian businesses: Website Index

Website index (or web indexing) is the first SEO problem any CEO or online marketer should look out for. Does Google have the pages of your website indexed in its ‘archive’?

Google periodically deploys spiders to travel around the Internet and scout for fresh content. When new content is found, Google uses the HTML tags on the respective websites to ‘file’ your web pages and web content in the appropriate ‘folder’.

Before your target audience can find you online via organic search, Google must have crawled your website and indexed your web pages and web content. Only what search engines have crawled and indexed will be displayed to people when they search.

How can I check my Website Index?
  • Log on to www.google.com
  • Type in ‘site:yourwebsiteaddress’
  • Press ENTER

Example: I typed in Google Search bar site:wdc.ng

[Screenshot: website index check for wdc.ng in Google Search]

You can see some of the indexed pages… 192 pages of wdc.ng are currently indexed.

Just like that, you can find out if your website is indexed. One common SEO problem solved, right?

Let’s go!

Number 2 SEO problem with Nigerian businesses: Meta Robots ‘NO INDEX’ tag

META ROBOTS ‘NO INDEX’ is a common SEO problem among Nigerian businesses. It is, in most cases, an oversight by web developers, who use this source code command to keep a new website that is still being worked on away from search engines.

Web developers want the website they are currently working on to be accessible to them alone or their clients. So, they use the Meta Robots ‘NO INDEX’ tag to keep it away from search engine spiders.

How is META ROBOTS ‘NO INDEX’ an SEO problem?

The Meta Robots ‘NO INDEX’ tag is an HTML tag that tells search engine algorithms to keep off certain web pages. Beyond being a mistake by web developers, META ROBOTS can be deliberately set to NO INDEX so that very sensitive web pages are protected from spam bots and other fishy robots crawling the Internet.

You may be wondering how realistic it is to keep Google and other crawling bots away from certain web pages on your website. Web pages that do not add value to your search ranking should be blocked, and so should sensitive web pages containing vital information not meant for public consumption; both can be blocked by setting META ROBOTS to ‘NO INDEX’.

Other web pages you want to use META ROBOTS ‘NO INDEX’ to block are on-site search results pages, auto-generated web pages, and pages that hold passwords or card details.
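Blocking such a page takes one line in its HTML head section. This is the standard tag; the page it sits on here is hypothetical:

```html
<!-- On, say, your on-site search results page: tell all search
     engine bots not to index this page or follow its links -->
<meta name="robots" content="noindex,nofollow">
```

The same tag, left behind on your home page by mistake, is exactly the SEO problem this section is warning you about.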

Outside of these web pages that you are advised to block, ensure the most important web pages on your business website are not keeping search engine bots away.

How do I check META ROBOTS ‘NO INDEX’?
  • Open the web page in your browser.
  • Right click on your mouse.
  • From drop-down, click ‘Inspect’
  • Press Ctrl + F and search for ‘meta’

Look for a line like ‘<meta name="robots" content="index,follow,noodp,noarchive">’

[Screenshot: checking the Meta Robots tag in the page source]

If the line in your source code looks similar to this, search engine bots are permitted to index the web page, and links from it will pass your authority to the websites you link to.

Want to know more about ‘NOODP’ and ‘NOARCHIVE’, or does your line look different? Ask your developer if what you see isn’t as clean as this.

Number 3 SEO problem with Nigerian businesses: Site Speed


Site speed (or page load time) is how long it takes to fully display the web content of a particular web page in a web browser. Site speed affects how quickly readers land on your website after clicking through.

47% of consumers expect a web page to load in 2 seconds or less, while 40% of people abandon a website that takes more than 3 seconds to load. -- Kissmetrics.

How does Site Speed create SEO problems?

“You may have heard that here at Google we’re obsessed with speed, in our products and on the web. As part of that effort, today we’re including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.” -- Google.

You got that straight outta Google.

How do I check my Site Speed?
  • Open Google’s free PageSpeed Insights tool in your browser
  • Type in your website address
  • Click ‘Analyze’

You could lose organic search ranking if your website takes forever to load. The more readers get annoyed and abandon your website, the higher your bounce rate. A high bounce rate signals terrible user experience to Google.

Larry Kim, CEO of WordStream, did extensive experiments on how user engagement metrics significantly affect organic search results. If you need proof that user experience signals like page load time matter to your organic search ranking, read up on his 6 SEO Experiments That Will Blow Your Mind.

Number 4 SEO problem Nigerian businesses have: Robots.txt file

The robots.txt file is a file kept at the root of your website. It uses a protocol called the Robots Exclusion Standard to tell different types of crawlers which sections of your website they may and may not access.

The robots.txt file serves a purpose similar to the META ROBOTS ‘NO INDEX’ tag I mentioned earlier, but it works completely differently. Google even warns that you should not use the robots.txt file to do what META ROBOTS ‘NO INDEX’ is supposed to do.

Robots.txt should not be used for hiding web pages from search crawlers; use META ROBOTS ‘NO INDEX’ on the particular web pages instead. If you do, search crawlers can still reach the web pages you are trying to hide through other web pages that link to them. Do you get?

How can Robots.txt cause SEO problems for Nigerian businesses?

Robots.txt can become an SEO problem if it disallows every crawler, for example when ‘User-agent: *’ is followed by ‘Disallow: /’. There are other coding mistakes, like the misuse of * and / in the robots.txt file. So it is a file you should check to be sure it is fine.
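That worst case is just two lines. This file tells every crawler to keep off your entire website, and it is a common leftover from development:

```
# Blocks ALL crawlers from ALL pages -- your site vanishes from search
User-agent: *
Disallow: /
```

If your robots.txt looks like this on a live business website, that alone can explain why you are invisible in organic search.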

I am aware that not every Nigerian business owner is a web developer, and not everyone wants to be bothered about programming bugs. However, the robots.txt file is easy to check.

I will show you how to check the robots.txt file now, and you will see what a healthy robots.txt file looks like. If your robots.txt file looks different, or every crawler is disallowed, there is trouble.

How do I check Robots.txt file?
  • Open your web browser
  • Type in ‘yourwebsiteaddress/robots.txt’
  • Press ENTER

Example: I typed in ‘www.wdc.ng/robots.txt’. See result below:

[Screenshot: robots.txt file of wdc.ng]

You see that my ‘User-agent’ is set to ‘*’?
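For comparison, a healthy, permissive robots.txt, of the kind many WordPress sites generate, looks something like this (the paths and sitemap URL here are illustrative):

```
# Allow all crawlers everywhere except the admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
```

Notice the difference: only private sections are disallowed, never the whole site.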

If your source code looks different, get in touch with your web developer.

Number 5 SEO problem with Nigerian businesses: URL Canonicalization

Canonicalization is the process of picking the best URL when there are several choices, and it usually refers to home pages. Google’s Matt Cutts said so.

URL Canonicalization is a very popular SEO problem among Nigerian businesses. I have noticed the same website having ‘www.abc.com’ and ‘abc.com’ as web addresses. Have you seen one too? It feels normal, right? Well, it is not normal.

How does URL Canonicalization create SEO problems for Nigerian businesses?

Simply put, if you have more than one correct or valid URL leading to your home page (to be specific), you may end up having the same web page under different URLs, and then Google search does not know which URL should rank better.

If you are cool with having ‘www.wdc.ng’ and ‘wdc.ng’ for your website at the same time, without resolving to a particular one, then as you create new web pages you will have something like ‘www.wdc.ng/new-page’ and ‘wdc.ng/new-page’ leading to the same web page and the same content at the same time.

Then, search engines will have a problem deciding which of the two web pages should get the link votes, social shares and engagement juice, as both of them will look like ‘photocopies’ to them.

Choose your preferred URL, using a canonical URL, and stick to it. Be faithful to one URL, like your partner. No stone thrown, just saying!

Anyways, if you find you have more than one valid URL leading to your home page, contact your web developer right away. Tell him or her to use a canonical URL, which is an HTML link tag with the attribute rel=canonical, to marry your multiple URLs.
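In practice, the developer adds one line to the head section of each duplicate page, pointing at the URL you chose. The address below is illustrative:

```html
<!-- On both www.example.com/new-page and example.com/new-page -->
<link rel="canonical" href="https://www.example.com/new-page">
```

Search engines then treat that preferred URL as the one that collects all the link votes and other ranking signals.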

Doing this will inform search engines that those different URLs are united (not Manchester though)

How do I check for canonical URLs?

It is not all the time that multiple URLs leading to the same web page are the ‘work of the devil’ though. Sometimes a very popular piece of content appears in different sections of your website, or a product page shows up on different websites.

So, if it happens, resolve the multiple URLs with the canonical URL tag and move on. To check, view a web page’s source (right click, then ‘View page source’), press Ctrl + F and search for ‘canonical’.


About the Author: Adeyemi is a digital marketing expert certified by Digital Marketing Institute, Ireland, HubSpot Academy and YouTube Brand Partner on Content Strategy.

He serves engaging content for SEO, email marketing and social media, and helps businesses with strategy for optimizing YouTube video content for the best ROI possible. So, do you have business objectives to achieve at the awareness, consideration or conversion stages of your marketing funnel? Talk to him.



43b, Emina Cres, Allen, Ikeja.

 Techpremier Media Limited. All rights reserved