About /robots.txt
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.
It works like this: a robot wants to visit a website URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:
User-agent: *
Disallow: /
The “User-agent: *” means this section applies to all robots. The “Disallow: /” tells the robot that it should not visit any pages on the site.
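To see how a compliant crawler performs that check in practice, here is a minimal sketch using Python's standard urllib.robotparser module. The crawler name "ExampleBot" is only a placeholder; with the record shown above, the check reports that the page may not be fetched.

import urllib.robotparser

# Point the parser at the site's robots.txt and download it.
parser = urllib.robotparser.RobotFileParser()
parser.set_url("http://www.example.com/robots.txt")
parser.read()

# Ask whether a crawler (a hypothetical "ExampleBot") may fetch the page.
if parser.can_fetch("ExampleBot", "http://www.example.com/welcome.html"):
    print("Allowed to fetch the page")
else:
    print("robots.txt disallows this page")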
There are two important considerations when using /robots.txt:
- Robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities and email address harvesters used by spammers will pay no attention to it.
- The /robots.txt file is publicly available. Anyone can see which sections of your server you don’t want robots to use.
For these reasons, it is not recommended to use /robots.txt to hide information.
How to edit the /robots.txt file from WebConsole:
Log in to the Console using your domain name and password. From the Console screen, click Administration -> Edit Robots.txt.
What to put in it?
The “/robots.txt” file is a plain text file containing one or more records. It usually contains a single record that looks like this:
User-agent: *
Disallow: /uploadedFiles/
Disallow: /xml/
In this example, two directories are excluded. Note that you need a separate “Disallow” line for every URL prefix you want to exclude; you cannot write “Disallow: /uploadedFiles/ /xml/” on a single line. Also, a record may not contain blank lines, because blank lines are used to delimit multiple records.
Note also that globbing and regular expressions are not supported in either the User-agent or Disallow lines. The ‘*’ in the User-agent field is a special value meaning “any robot”. Specifically, you cannot have lines like “User-agent: *bot”, “Disallow: /AddToCart/*” or “Disallow: *.jpg”.
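If you want to confirm how a rule is interpreted, one option is to run it through the same urllib.robotparser module locally; this is only an illustrative check, not a WebConsole feature, and the file names are made up. It shows that a Disallow value acts as a plain path prefix:

import urllib.robotparser

# parse() accepts the robots.txt content as a list of lines.
parser = urllib.robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /uploadedFiles/",
])

# Any URL whose path starts with /uploadedFiles/ is blocked...
print(parser.can_fetch("*", "http://www.example.com/uploadedFiles/photo.jpg"))  # False
# ...while other paths stay allowed; no wildcards are involved.
print(parser.can_fetch("*", "http://www.example.com/index.html"))  # True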
What you want to exclude depends on your server. Everything not explicitly disallowed is considered fair game to retrieve. Here are some examples:
To exclude all robots from the entire website
User-agent: *
Disallow: /
To allow all robots complete access
User-agent: *
Disallow:
(or just create an empty “/robots.txt” file, or don’t use one at all)
To exclude all robots from part of the website
User-agent: *
Disallow: /xml/
Disallow: /tmp/
Disallow: /flash/
To exclude a single robot
User-agent: BadBot
Disallow: /
To allow a single robot
User-agent: Google
Disallow:
User-agent: *
Disallow: /
To include all files except one particular folder
This is currently a bit awkward, as there is no “Allow” field. The easy way is to put all the files you want to exclude into a separate directory, say ‘stuff’, keep the files you do want crawled in the level above it, and disallow that directory:
User-agent: *
Disallow: /uploadedFiles/stuff/
Alternatively, you can explicitly list every page you want to disallow:
User-agent: *
Disallow: /uploadedFiles/junk.html
Disallow: /uploadedFiles/foo.html
Disallow: /uploadedFiles/bar.html
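Before publishing either variant, you can sanity-check it with the same local parser; the file names below are hypothetical and only illustrate the effect of the rule:

import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /uploadedFiles/stuff/",
])

# Everything inside /uploadedFiles/stuff/ is excluded...
print(parser.can_fetch("*", "/uploadedFiles/stuff/hidden.html"))  # False
# ...while a file kept one level above the 'stuff' directory is still crawlable.
print(parser.can_fetch("*", "/uploadedFiles/page.html"))  # True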