How to optimize your Zoho Sites website for search engines

August 26, 2019

Welcome to Zoho Sites. Your Zoho Sites website can be optimized for search engines, and I'll show you how. Go to Manage and then Settings, find SEO on the left-hand side, and click it. This is where you enter the inputs for optimization, which can have a notable impact on how search engines treat your website. You need to enter a title, keywords, and a description.

Let's take a look at an example of a search result. Here's how a Google search result appears to users: the three main areas of importance are the title, the URL, and the description.

Getting back to filling in the details: under Page, you can choose whether you want
to optimize the entire site or a single page on your website. I'm going to select a page: Salads.

The title of your page is enclosed in title tags. A good title is unique and readable, and the keyword should preferably appear within the first half of the title. Once that is done, you move on to keywords.

A keyword is the term people search for in order to arrive at a website. Each page has its own focus terms, and the title, description, and URL of that page depend on, and contain, its keywords. The more specific your keywords are, the better.
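To make this concrete, the title you enter here ends up inside the page's HTML title tags. A hypothetical rendering for the Salads page might look like this (the site name and exact wording are made up for illustration):

```html
<head>
  <!-- Hypothetical example: the keyword "salads" appears within the first half of the title -->
  <title>Healthy Salads and Salad Recipes | Greens Cafe</title>
</head>
```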
So if my page is about salads, here are the keywords I would use: salads, salad recipes, healthy salads. Include variations of your keywords, avoid irrelevant keywords, and definitely avoid spam.

The next field is the description. The description appears below the title in search results; it should be short and should support the claim you've made in your title. A description should generally be about 160 characters long. Users are more likely to click on a search result that contains the term they just searched for, which is why the title and description draw attention when they contain keywords. Once you've entered the title, the keywords, and the description, just hit Save.

The next thing we look at is crawlers. Crawlers are the bots that crawl your website
and in Zoho Sites you have the option of providing a robots.txt file for them.

So what exactly is robots.txt, and what role does it play? Your robots.txt contains two important pieces of information: which bots are allowed to crawl the website, and which pages on the site should not be crawled. It uses two directives. One is User-agent, which specifies the bot the instructions apply to; the other is Disallow, which specifies the pages that are restricted.

I'll show you a simple example. The star (*) against User-agent means that the following rules apply to every kind of bot that crawls the site, and the slash (/) against Disallow means that all sub-directories in the root folder are restricted to bots. That means that no page in the root folder
should be crawled by any bot.

To show you an example, I'm on the home page of my website. Now, if I click on the Salads page, you'll find that Home.html in the URL has been replaced by Salads.html. The slash you see follows the .com, and anything after that slash is the page name. If I want to disallow any bot from crawling the Salads page, all I need to do is take that part of the URL (/salads.html) and type or paste it against Disallow. The file now reads User-agent: *, Disallow: /salads.html, and this means that no bot can crawl the Salads page on my website.

You can give multiple robots.txt entries. Here's another example: this one means I have disallowed the Google bot from crawling the Contact page on my website. Then hit Save.

That's all we have for elementary SEO. Thanks for watching!
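To tie the robots.txt examples together, here is a sketch that checks such rules with Python's standard-library urllib.robotparser. The domain example.com, the bot name AnyBot, and the combined file below are assumptions for illustration; it merges the two examples from this walkthrough into one hypothetical robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt combining both examples above: every bot is
# blocked from /salads.html, and Googlebot gets its own group
# blocking /contact.html.
robots_txt = """\
User-agent: *
Disallow: /salads.html

User-agent: Googlebot
Disallow: /contact.html
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A generic crawler falls under the * group.
print(rp.can_fetch("AnyBot", "https://example.com/salads.html"))   # False
print(rp.can_fetch("AnyBot", "https://example.com/contact.html"))  # True

# A bot that matches a specific User-agent group follows only that
# group: Googlebot obeys its /contact.html rule but ignores the
# * group's /salads.html rule.
print(rp.can_fetch("Googlebot", "https://example.com/contact.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/salads.html"))   # True
```

Note the subtlety in the last check: under the robots exclusion standard, a crawler that matches a specific User-agent group uses only that group, so the * rules no longer apply to it. To block Googlebot from both pages, the Disallow: /salads.html line would have to be repeated inside the Googlebot group.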
