While marketing an online store, almost every store owner and Internet Marketer comes across questions that are so obvious that we overlook them. We assume that we know the answer but we don’t realize how they’re impacting our business.
So, we’ve put together a list of 8 such SEO questions that many of our ‘smart’ clients at I Love Fashion Retail have asked us and we will try to understand how these questions drive businesses.
- Should I block search engines from indexing product pages with duplicate content?
- Should I delete a product page as soon as it goes out of stock and is not expected to be in stock in future?
- Should I disallow/block product search result pages from crawlers?
- Is it possible to succeed in e-commerce without investing too much time in SEO?
- Should I mention the domain name in the title tag of all pages? If yes, where – at the beginning or the end of the title tag?
- Which pages should I block from my robots.txt file?
- What should I include in my XML sitemap? (and what not)
- Should I add paragraph-long content to my home page and compromise on the site’s aesthetics?
We will try to answer each of these questions with a definitive yes/no or this/that. We understand our answers may not apply to all e-commerce businesses, but for the sake of giving definitive answers to these tricky questions, we are taking the risk of being irrelevant to specific situations, and we welcome your disagreement. So, here we go:
Question #1: Should I block search engines from indexing product pages with duplicate content?
Answer: Yes, you can block search engines from accessing your product pages IF the content on these pages is duplicate / copied.
Why would you want to do that? Most online retailers resell products from other retailers, wholesalers and manufacturers. In most cases, when the inventory gathered from these sources is multiplied by specifications such as size and color, it becomes practically impossible for a store owner to write unique product descriptions and meta tags for every product page. What happens then? Retailers upload product feeds of thousands of products with duplicate descriptions and meta tags, exactly the same as what their competitors are uploading. To search engines this is dirty duplicate content, since you are not linking back to the source as credit (of course, you wouldn’t want to link out to the manufacturer and risk losing your business to them).
Solution: No, generating a mammoth XML sitemap of duplicate pages won’t help. That sitemap might get the pages indexed, but it doesn’t mean Google will send traffic to them. Do the opposite. Remove product pages with duplicate content from your XML sitemap and block crawler access to them using your robots.txt file or a meta robots tag on each page. Then submit product pages to crawlers in small batches as you rewrite each one with original content.
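For the per-page route, the meta robots tag goes in the page template. A minimal sketch (whether `follow` or `nofollow` is right for you depends on how you want link value to flow):

```html
<!-- Placed in the <head> of a product page with copied content.
     noindex keeps the page out of search results; follow still
     lets crawlers pass link value through the page's links. -->
<meta name="robots" content="noindex, follow">
```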
Question #2: Should I delete a product page as soon as it goes out of stock and is not expected to be in stock in future?
Answer: No. Keep ‘out of stock’ product pages live. Don’t delete them and don’t redirect them.
Because when a page is gone (removed or 301 redirected), the organic, referral and social media traffic it earns (from sites like Pinterest, Twitter or Facebook) is gone with it, and abrupt redirects can turn visitors off. Instead, add an email sign-up box to the page and collect customers’ email addresses for future selling.
Question #3: Should I disallow/block product search result pages from crawlers?
Answer: Yes, you must disallow product search result pages, as they are duplicate content and of value only to human visitors.
For bots, these pages are nothing but repetitive content pulled from your categories. To make it worse, some retailers link to such pages from their navigation and use them as category pages (especially when they want a new category page as a PPC landing page that doesn’t otherwise exist in the navigation). This can damage your store’s organic SEO. So make sure you don’t link out to search result pages from your site’s navigation, and block these pages in your robots.txt file using wildcards.
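As a sketch, assuming your internal search results live at URLs like yourstore.com/search?q=dress (the path and parameter name here are hypothetical; check your own store’s URL structure first):

```txt
User-agent: *
# Block the internal search results path
Disallow: /search
# Wildcard: block any URL carrying the search query parameter
Disallow: /*?q=
```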
Question #4: Is it possible to succeed in e-commerce without investing too much time in SEO?
Answer: Yes, it’s possible if you have the ability and passion to create inspiring content. We have seen quite a few retailers wasting their time over-engineering SEO; time they could invest in producing inspiring content. Once you have ensured there is nothing wrong with your store’s indexing (no roadblocks for search engines), stop yourself from wasting time on unproductive SEO, such as performing keyword research and writing meta tags for product pages with duplicate content. Spend that time instead producing rich content that attracts visitors, backlinks and a social media following. If you take care of content, you take care of search engines as well, and you won’t have to waste time writing unnecessary, fancy wildcard directives to seduce crawlers.
Question #5: Should I mention the domain name in the title tag of all pages? If yes, where – at the beginning or the end of the title tag?
Answer: No. You don’t have to mention your brand name in the title tag of every product and category page. We have seen many online retailers waste title tag space by mentioning their domain name on every single page of their store. Instead, mention the domain name (with or without the TLD) at the beginning of the title tag on just the homepage and CMS pages. Don’t worry about using the domain name in the title tags of category or product pages; you have the description tag for that.
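As a hypothetical illustration (the store and product names are invented):

```html
<!-- Homepage / CMS pages: lead with the brand name -->
<title>YourStore – Women's Fashion Online</title>

<!-- Product page: spend the space on the product, not the brand -->
<title>Red Silk Summer Dress</title>
```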
Question # 6: Which pages should I block from my robots.txt file?
Answer: The robots.txt restrictions depend on your store’s platform, its structure and what content you want to hide from search engines. However, some pages are common to all e-commerce websites, and some of them should be blocked while others should never be. Here is a quick list:
**Pages/files you must block in robots.txt:**

- Product search result pages – repetitive content meant only for users, not crawlers.
- Admin directories and pages – make sure your store’s admin pages are blocked, especially if you link to them from your HTML.
- Demo server locations – do you have a demo server where you test design and functionality before it goes live (for example, yourstore.com/demo or demo.yourstore.com)? Block these locations, as they are mirror duplicates of your website.
- Product pages with copied content – as explained above, don’t offer search engines pages with copied content (unless you can give a credit link), and product pages are no exception to this rule.
- Product tag pages – you have already allowed Google to index category pages; it’s better not to serve tag cloud pages as well. Let Google focus on pages with value.
- Personal / sensitive information – last but not least, block any pages with information you don’t want people to find by searching Google.

**Pages/files you must never block in robots.txt:**

- Images – don’t disallow Google from indexing your images, unless you mind seeing Google Images as a traffic source in your Google Analytics.
- Category pages – you might argue that category pages should be excluded too, since their content is copied from product pages. Don’t doubt Google’s intelligence by hiding category pages; Google understands the structure of a typical e-commerce website and doesn’t mind indexing and ranking a category page.

Beyond these, depending on your store’s platform, you might consider disallowing directories such as comments, include, feed, skin, CGI-bin etc.
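Pulled together, a robots.txt built from the list above might look like this. Every path here is a hypothetical example and must be matched to your own store’s URL structure:

```txt
User-agent: *
Disallow: /search      # product search result pages
Disallow: /admin/      # admin directories and pages
Disallow: /demo/       # demo / staging copy of the site
Disallow: /tag/        # product tag (tag cloud) pages
Disallow: /account/    # pages with personal information
```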
Note: An excessive number of pages blocked by robots.txt restrictions can hurt your organic traffic, as the value of inbound links pointing at those pages is lost. Incorrect robots.txt directives can restrict the flow of link juice through your store.
Question # 7: What should I include in my XML sitemap? (and what not)
Answer: We all know we can’t serve everything to search engines in an XML sitemap, but we have seen a lot of SEOs and store owners get confused about what belongs in the XML sitemap and what doesn’t. So, here is a quick list:
| What to include in the XML sitemap | What not to include |
| --- | --- |
| Product pages | Product pages with duplicate meta tags |
| | Product pages with duplicate descriptions |
| Category pages | Dynamically created category pages (with parameters like ?id, ?cat etc.) |
| | URLs with filters and pagination (with parameters like ?p=, ?price= etc.) |
| Brand / manufacturer pages | Brand pages with duplicate content |
| | Search result pages |
| | Review pages (they are already pulled onto your product pages) |
| Blog pages | Product tag pages |
| Mobile compatible pages | Pages blocked by robots.txt |
| Canonical URLs of product pages | Pages returning 404 or server errors |
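For reference, a minimal XML sitemap covering the “include” column has this shape (the URLs are placeholders, not recommendations):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- canonical category page -->
  <url><loc>https://www.yourstore.com/dresses/</loc></url>
  <!-- canonical product page with original content -->
  <url><loc>https://www.yourstore.com/dresses/red-silk-summer-dress/</loc></url>
</urlset>
```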
Question #8: Should I add paragraph-long content to my home page and compromise on the site’s aesthetics?
Answer: You can add content of any length to your home page if it pleases your customers.
Just don’t add content to the home page to please search engines. Google doesn’t expect to see a lot of content on the home page anyway; it understands that a home page is mostly made up of navigation links and images. So if you want an image-and-navigation-only home page for aesthetic or branding reasons, don’t hold back out of fear that it will hurt your store’s performance in search engines. It won’t.
This list is by no means exhaustive; there are questions no one knows the answers to. If you disagree with any of our answers here, feel free to express your disagreement or add a new question to the list in the comments section below.
Pulkit Rastogi, Founder & Ecommerce Consultant
Specializes in Fashion Ecommerce – Customer Acquisition, Retention, Conversion Rate Optimization and Brand Positioning. Published Writer & Amateur Ruby on Rails Programmer.