The Importance of Back Links & How to Gather Them

Backlinks are essential if you want your website to receive free visitors from search engines like Google, Yahoo! and Bing. A backlink is counted when Website A links to Website B. Search engines recognize when one website links to another and consider the linked-to site more worthy of appearing in search results.

To sum up: the more backlinks you have pointing to your website, the higher search engines will tend to rank it.

How do I get backlinks?

Gathering backlinks (known as link building) sounds complicated, but there are many ways to obtain links to your website with very little effort.

Free Internet Directories

Before there were search engines, the web was filled with directories that people browsed to find the websites they were looking for. Thousands of these directories, such as Dmoz and Jayde, remain on the internet, and many are still in business and accepting new links.

The majority of these directories work in the same way: you choose the category that best suits the nature of your website, then submit your URL for inclusion. Once your link has been approved, your listing goes live and you will have achieved your first backlink. Use Google to search for free website directories; there are hundreds out there.

Link Exchanges

A link exchange describes two websites linking to each other, so Website A links to Website B and Website B also links to Website A. This is a great way of quickly gaining backlinks, and a free link exchange service is a good place to start.

Article Directories

Article directories allow you to write articles and publish them within the website’s directory. In exchange for your hard work, you can include links to your website within the article. Ezine Articles is the most popular of all, but there are plenty more free article sites to be found on Google.

Link bait

Link bait is a term used to describe articles or pages on your website that are intended to attract backlinks from other websites. Controversial subjects, useful tools and free resources are just a few ideas; the art of link bait is being creative – go in at an angle no one has thought of before.

Link bait is the most difficult form of link building, but since the backlinks are 100% natural, the results are very rewarding.

These are just a few ways you can increase the number of incoming links to your website; there are literally hundreds more. SEOBook.com has compiled a list of 101 ways to build link popularity, and the article offers some interesting ideas.

7 Smart Ways to Speed Up Your Website

The speed of your website is one of the major factors in its success. Whether you’re operating an eCommerce site or developing a simple blog, visitors want pages that load fast. According to Akamai, a global leader in Internet performance and network development, nearly half of the people online will abandon a page if it takes longer than three seconds to display content. That percentage doubles for those looking to make purchases online. Even search engines will rank your site lower in results if it takes too long to load.

Luckily, there are several things you can do to speed up a website. While some of these may require a bit of programming knowledge, others can be done just by making a few changes to how you create content. Regardless of how it’s done, putting effort into these measures to enhance the user experience of your site can improve visitor retention as well as search engine priority. The following are seven things you can do to help your site load more quickly on various devices. Committing to just one of these has the potential to increase viewership.

1. Optimize Imagery


Images are one of the most prominent causes of slow load times. High-resolution pictures can be incredibly large and take a great deal of time to download and render. When you upload a 3000-pixel-wide image and use code to display it at 150 pixels wide, browsers will still download the full-size image and then shrink it to fit.

A way to reduce the time it takes computers and smartphones to download an image is to size it manually before uploading it to your website. If you need a 150-pixel-wide image on your site, create a 150-pixel-wide image to upload. You should also keep images in JPEG or PNG format; formats such as BMP and TIFF are more complex and simply take up too much memory.
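
As a rough sketch of this pre-sizing step (the file names, the 150-pixel target and the use of the Pillow library are assumptions for illustration only – any image editor or build tool works just as well), an image can be resized and compressed before upload like this:

# Sketch: pre-size an image to its display width before uploading it.
# Assumes the Pillow library is installed (pip install Pillow); the file
# names and the 150 px target width are example values only.
from PIL import Image

def presize(source_path, dest_path, target_width=150):
    with Image.open(source_path) as img:
        # Keep the aspect ratio: scale the height by the same factor as the width.
        ratio = target_width / img.width
        target_height = max(1, round(img.height * ratio))
        resized = img.resize((target_width, target_height), Image.LANCZOS)
        # optimize/quality shrink the file further for JPEG output.
        resized.save(dest_path, optimize=True, quality=85)

presize("banner-3000px.jpg", "banner-150px.jpg", target_width=150)

Whatever tool you use, the point is the same: the file you upload should already be the size you intend to display.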

2. Reduce Plugins

Plugins can add a great deal of load time, especially if your site is bogged down with them. While content management systems such as WordPress and Joomla use plugins for customization, there is no need to install everything that looks “cool”. Something you thought made your site look attractive may be hindering its performance.

Snippets of code pasted into websites for extra features behave in much the same way. For example, an embedded Twitter feed isn’t actually a plugin, as its code is copied and pasted directly into a page on your website. However, it still contributes to load time because it pulls information from a server other than your own.

3. Clean-up Framework or Themes

A lot of slow websites suffer from inefficient coding. Sites that have been up for more than a decade are often slower than others because the code hasn’t been updated. Styles and coding practices have changed a great deal, and your site may be a victim of outdated HTML.

Themes such as those used in WordPress and Joomla could also be cause for a slow site. The same principle applies when considering how long your site has been in operation with the same theme. If you want to speed up your website, you might want to ensure it’s using current coding practices.

4. Use a Content Delivery Network


A content delivery network, or CDN, stores copies of your persistent files on servers located closest to your target audience. Many hosting providers offer this kind of platform because it can vastly reduce load times for your visitors.

Assets such as CSS files, images and some JavaScript files are stored on these networks, reducing their access times. For instance, if the CDN has a server in Miami, your Florida visitors get a much faster connection to that content.
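
As a tiny sketch of the idea (the host name and asset paths below are made-up placeholders, not a real CDN integration), many sites simply rewrite the URLs of their static assets so browsers fetch them from the CDN host instead of the origin server:

# Sketch: point static asset URLs at a CDN host instead of the origin server.
# "cdn.example.com" and the asset paths are hypothetical placeholders.
CDN_BASE = "https://cdn.example.com"

def cdn_url(asset_path):
    """Return the CDN-hosted URL for a static asset path like '/css/site.css'."""
    return CDN_BASE + asset_path

print(cdn_url("/css/site.css"))      # https://cdn.example.com/css/site.css
print(cdn_url("/images/logo.png"))   # https://cdn.example.com/images/logo.png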

5. Optimize the Homepage

You don’t want your homepage to be too busy. Showing 30 posts on the homepage reduces its speed. Many website owners take a more minimalist approach to design for the homepage and other landing pages. Removing Flash video content, reducing image use, reducing banners and only showing a few of the most common posts are some of the more common practices.

Many have also adopted the philosophy of reducing scrolling on the homepage: if visitors have to scroll too far to reach the bottom, there is too much on it. This approach centers on keeping the site as basic as possible.

6. Consider Your Host’s Capabilities

Not all speed problems can be fixed by changing code or resizing images. In fact, your host could be playing a part in the efficiency of your website. If all else fails, you may need to find a host with a better track record for accessibility.

You may also want to consider switching your site to a dedicated server rather than leaving your hosting provider. For the most part, hosting accounts are stored on shared servers, which means the memory, CPU, drive usage and other resources are being used by several websites at once. A dedicated server may cost a bit more, but it allows your site to use most of the available resources, which can noticeably speed it up. Learn more about WebhostingWorld Dedicated Server Packages.

7. Use Redirects as Little as Possible

Redirects can be quite beneficial when you are building a new site, revamping existing formats or simply moving to a new server. However, each redirect also adds to your load time.

Websites that have built mobile variants will often use redirects when a smartphone or tablet is detected. Using HTTP redirects instead of JavaScript-based redirects may help reduce the time it takes to send users to the correct version of the site.
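
As a rough illustration of a server-side (HTTP) redirect, here is a minimal sketch using Python and Flask. The mobile host name m.example.com and the very crude User-Agent check are assumptions made for the example; in practice this is usually handled in your web server or CMS configuration.

# Minimal sketch of an HTTP redirect to a mobile variant of a site.
# Assumes Flask is installed; "m.example.com" and the User-Agent keywords
# are placeholder examples, not production-grade device detection.
from flask import Flask, redirect, request

app = Flask(__name__)

MOBILE_HINTS = ("Mobile", "Android", "iPhone", "iPad")

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maybe_redirect(path):
    user_agent = request.headers.get("User-Agent", "")
    if any(hint in user_agent for hint in MOBILE_HINTS):
        # A 302 HTTP redirect sends the browser to the mobile site
        # before the desktop page is rendered at all.
        return redirect("https://m.example.com/" + path, code=302)
    return "Desktop version of /" + path

if __name__ == "__main__":
    app.run()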

In an age where the competition among websites is extremely high, you need to take every measure you can to create an impact on your target audience. Although content is king, speed is a must-have attribute. Do what you can to enhance the experience of your visitors as well as the search engines. In the fast-paced environment of the Internet, every second counts.

 

Changing The WordPress Site URL

Your WordPress site address is what people use to access your site – for example, www.example.com. This document explains how you can change that address to something else.

1. Log in to your WordPress admin panel and go to Settings > General.
2. Update the WordPress Address (URL) and Site Address (URL) fields and save the changes.


3. You will also need to regenerate the permalinks in Settings > Permalinks so that they use the new URL.
For instance, if you were using the Post name structure, switch to Default, save the changes, and then switch everything back.


If the Dashboard does not open or will not let you in for some reason, you can make the changes directly in the MySQL database.

The instructions below will guide you through changing the WordPress website URL using phpMyAdmin in cPanel.

1. Log in to your cPanel and navigate to the phpMyAdmin menu.


2. Choose the database which is being used for your WordPress blog and click on it.

If you are not sure exactly which database you need, check the wp-config.php file located in the document root – in our case /public_html/wp/. The database name is the value defined on the DB_NAME line of that file.


3. Click on the wp_options table and edit the siteurl and home fields.


That’s it!
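
If you prefer the command line to phpMyAdmin, the same two fields can also be updated with a short script. This is only a sketch: it assumes the PyMySQL package, the default wp_ table prefix, and placeholder credentials that you would replace with the values from your own wp-config.php.

# Sketch: update the WordPress siteurl and home options directly in MySQL.
# Assumes PyMySQL is installed (pip install pymysql), the default "wp_"
# table prefix, and placeholder credentials copied from wp-config.php.
import pymysql

NEW_URL = "http://www.example.com"  # example value only

conn = pymysql.connect(
    host="localhost",
    user="db_user",          # DB_USER from wp-config.php
    password="db_password",  # DB_PASSWORD from wp-config.php
    database="db_name",      # DB_NAME from wp-config.php
)
try:
    with conn.cursor() as cur:
        cur.execute(
            "UPDATE wp_options SET option_value = %s "
            "WHERE option_name IN ('siteurl', 'home')",
            (NEW_URL,),
        )
    conn.commit()
finally:
    conn.close()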

Learn about the robots.txt file

The robots exclusion protocol (REP), or robots.txt, is a text file webmasters create to instruct robots (typically search engine robots) how to crawl and index pages on their website.

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but search engines generally obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection). Putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.


When a search engine crawls (visits) your website, the first thing it looks for is your robots.txt file. This file tells search engines what they should and should not index (save and make available as search results to the public). It also may indicate the location of your XML sitemap.

Google’s official stance on the robots.txt file

A robots.txt file consists of records with two kinds of fields: a line naming a user-agent (the search engine crawler the rules apply to) and one or several lines starting with a directive such as Disallow or Allow.

  • How to create a robots.txt file

You will need to create it in the top-level directory of your web server.

When a robot looks for the “/robots.txt” file for a URL, it strips the path component from the URL (everything from the first single slash) and puts “/robots.txt” in its place.

For example, for “http://www.example.com/shop/index.html”, it will remove the “/shop/index.html”, replace it with “/robots.txt”, and end up with “http://www.example.com/robots.txt”.

So, as a web site owner you need to put it in the right place on your web server for that resulting URL to work. Usually that is the same place where you put your web site’s main “index.html” welcome page. Where exactly that is, and how to put the file there, depends on your web server software.

Remember to use all lower case for the filename: “robots.txt”, not “Robots.TXT”.

At a minimum, you can simply create a blank file and name it robots.txt. This will prevent “robots.txt not found” errors in your logs and allow all search engines to crawl and index anything they want.

Here’s a simple robots.txt file:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /

1. The first line explains which agent (crawler) the rule applies to. In this case, User-agent: * means the rule applies to every crawler.

2. The subsequent lines set which paths can (or cannot) be indexed. Allow: /wp-content/uploads/ allows crawling of your uploads folder (images), and Disallow: / means no other file or page should be indexed aside from what has been allowed previously. You can have multiple rules for a given crawler.

3. The rules for different crawlers can be listed in sequence, in the same file.

  • Examples of usage


Prevent the whole site from being indexed by any web crawler:

User-agent: *
Disallow: /

Allow all web crawlers to index the whole site:

User-agent: *
Disallow:


Prevent only certain directories from being indexed:

User-agent: *
Disallow: /cgi-bin/


Prevent the site from being indexed by a specific web crawler:

User-agent: Bot1
Disallow: /
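
To double-check how a crawler would interpret rules like these, you can test them with Python’s built-in urllib.robotparser module. The URLs below are placeholders – point the parser at your own site’s robots.txt.

# Sketch: test how a crawler would interpret a live robots.txt file.
# Uses only the Python standard library; the URLs are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # downloads and parses the robots.txt file

# Prints True or False depending on the rules for the given user agent and path.
print(rp.can_fetch("*", "https://www.example.com/wp-admin/"))
print(rp.can_fetch("*", "https://www.example.com/wp-content/uploads/photo.jpg"))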

  • Robots.txt for WordPress
When running WordPress, you want search engines to crawl and index your posts and pages, but not your core WP files and directories. You also want to make sure that feeds and trackbacks aren’t included in the search results. It’s also good practice to declare a sitemap. So if you haven’t yet created a real robots.txt file, create one with any text editor and upload it to the root directory of your server via FTP.
Blocking main WordPress Directories
There are three standard directories in every WordPress installation – wp-content, wp-admin and wp-includes – that don’t need to be indexed.

Don’t choose to disallow the whole wp-content folder though, as it contains an ‘uploads’ subfolder with your site’s media files that you don’t want to be blocked. That’s why you need to proceed as follows:

User-Agent: *
# disallow all files in these directories
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/

  • Miscellaneous remarks
  • Don’t list all your files in the robots.txt file. Listing the files allows people to find files that you don’t want them to find.
  • Don’t block CSS, JavaScript and other resource files by default. Blocking them prevents Googlebot from properly rendering the page and understanding that your site is mobile-optimized.
  • An incorrect robots.txt file can block Googlebot from indexing your page.
  • Put your most specific directives first, and your more inclusive ones (with wildcards) last.

GlotPress is Now Available as a WordPress Plugin!

Version 1.0 of GlotPress, named “Bunsen Honeydew”, is available for download in WordPress.org’s Plugin Directory.
This is the first public release of GlotPress as a WordPress plugin, an alternative to the standalone version.


Some highlights:

  • It’s a plugin!
  • WP-CLI support
  • Integrated with WordPress’s user system
  • Easy installation through the WordPress.org plugin directory
  • Multisite support

For this first version, the goal was to change as little as possible while getting it working well. Still, we had to remove some things in favour of WordPress’ API system. You can review the list of breaking changes in our wiki. If you have existing plugins for the standalone version, take a look at this guide on how to convert them to WordPress plugins.