As I’ve established in the previous two articles, search engine optimization is not magic, and it is not a quick fix that will generate results overnight. It is, though, absolutely essential to the success of your online marketing campaign. In my next article, I discuss the most important part of search engine optimization, Content SEO, but before I get to that, this article covers some technical tips you can use to help improve your website’s search ranking.
Google and other search engines decide how to rank websites based on a mathematical algorithm, the details of which are a very closely guarded secret. What we do know, though, is that search engines (I’ll just refer to Google from now on) use software called crawlers to scour the Internet for web pages by following links from other pages, either within your site or from other sites. The crawler looks at the content of each page and makes a determination as to its purpose and value. In my opinion, the purpose of Technical SEO is really not to game search results but simply to make it easier for Google to know what your site is for, making Content SEO more effective.
Following is a list of technical or programming changes you can make right away to help improve your search rankings.
Mobile is critical
Make sure you have a mobile version of your site. There are many ways to achieve a mobile-friendly site: responsive design, a parallel mobile site, or a mobile app. You can read more in my article “Techniques For Building A Mobile Site.” Over half of all searches now come from mobile devices, and Google has recently announced that it may actually begin to demote sites that do not have a mobile version. Get with your web designer right away and create a mobile-friendly version of your site. You can check whether your site is mobile friendly by running it through Google’s free Mobile-Friendly Test.
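The heart of a responsive design is the viewport meta tag plus CSS media queries. Here is a minimal sketch – the class names and the 600-pixel breakpoint are illustrative, not a recommendation for any particular site:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { float: right; width: 30%; }
    /* On narrow screens, stack the sidebar under the main content */
    @media (max-width: 600px) {
      .sidebar { float: none; width: 100%; }
    }
  </style>
</head>
<body>
  <div class="content">Main content…</div>
  <div class="sidebar">Sidebar…</div>
</body>
</html>
```

Whatever approach you take, run the finished pages through the Mobile-Friendly Test to confirm Google agrees the site works on phones.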
HTML Tags
Quite frankly, this one item is 90% of what most SEO companies do, and I believe it is almost completely meaningless on its own. I include it here because it is just good web design, even though search engines largely ignore these tags. I am talking about the tags in the HEAD of each page – the TITLE tag plus the META keyword and description tags – which give your page a title, keywords, and a description. Each page on your site should contain all three, and they should be unique to that page – in other words, don’t just repeat the same keyword and description tags on every page of the site. Additionally, each page should have at least one H1 tag containing targeted keywords.
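Here is a sketch of what the HEAD of a well-tagged page might look like; the business name and keywords are made up purely for illustration:

```html
<head>
  <title>Fresh Flower Delivery in Springfield | Example Florist</title>
  <meta name="description" content="Family-owned florist offering same-day flower delivery throughout Springfield.">
  <meta name="keywords" content="flower delivery, springfield florist, same-day flowers">
</head>
<body>
  <!-- At least one H1 containing the page's targeted keywords -->
  <h1>Fresh Flower Delivery in Springfield</h1>
</body>
```

Remember that every page gets its own unique title, description, and keywords matched to that page’s content.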
Images should have tags too: While Google has made tremendous strides in artificial intelligence, its search algorithms still can’t reliably index images. A crawler can’t tell the difference between a dog and a cat in a JPEG image, for example. Therefore, we add ALT attributes to image tags that tell Google what the content of the image really is.
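An ALT attribute is just one extra attribute on the image tag; the filename and description below are hypothetical:

```html
<!-- Without the alt text, Google sees only a meaningless filename -->
<img src="photo1234.jpg" alt="Golden retriever puppy playing in the grass">
```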
Page content optimization
The targeted keywords for each page should appear in the actual content of the page at least once and up to three times. I generally try to get the keywords somewhere within the first paragraph of the page. Remember, Google is smart; you don’t want to just dump keywords everywhere. You need to write meaningful content and put your client first, but find a way to integrate the keywords into what you are writing. In the process of writing good content, you may even discover new keywords to target.
Submitting your site
In order for search engines to find your new or existing site, someone needs to link to you. To expedite the process, though, you can actually tell Google and Bing to index your site immediately by using their free tools: Google Webmaster Tools and Bing Webmaster Tools. In both cases, you can ask them to crawl your site, and you can use the webmaster interface to identify errors in your link structure and code and get tons of tips on how to fix those errors.
Additional documents to generate
Many SEO providers will create an XML sitemap, which is basically a single document containing links to all of the pages that make up your site. I don’t see this as having much value, but some believe it should be done. If there are pages on your site that you specifically don’t want search engines to index, you can use a robots.txt file to instruct crawlers to ignore them. Google Webmaster Tools contains information on how to easily create both of these files.
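Both files are plain text and simple to write by hand. A hypothetical robots.txt – the paths here are placeholders, not a recommendation – looks like this:

```
# Block all crawlers from pages you don't want indexed
User-agent: *
Disallow: /admin/
Disallow: /thank-you.html

# Point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

And a minimal XML sitemap is just a list of your pages’ URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about.html</loc></url>
</urlset>
```

Both files live in the root directory of your site so crawlers can find them automatically.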
Encrypt your site
Google recently announced that it will give preferential treatment to sites that use SSL encryption. No one really knows yet how much weight this carries, but Google clearly wants all Internet traffic encrypted at some point. You can tell whether your site is encrypted by looking at the URL: encrypted URLs start with HTTPS, while standard URLs start with HTTP. The “S” means you are running an SSL certificate, and information flowing between the user’s computer and your web host is encrypted. Ask your web hosting company for details.
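Once the certificate is installed, you will also want to send visitors who type the plain HTTP address over to the HTTPS version. On an Apache server with mod_rewrite enabled – check with your host first, since server setups vary – an .htaccess sketch might look like this:

```
# Redirect all HTTP requests to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status tells Google the move is permanent, so the HTTPS version inherits the ranking of the old pages.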
Microformats could help
Microformats from schema.org could be a huge factor depending on the type of site you have developed. Let’s say your website serves up recipes, and each recipe is made up of several parts: the name, description, prep time, number of servings, ingredients, and lastly instructions. Microformats allow you to identify each of those parts for what it is. Schema.org has a microformat structure for recipes, for songs – for just about any structured data you can imagine. By adding this formatting, you make it easier for your content to be picked up not just by search engines but by sites such as Yummly or AllRecipes in the case of a recipe site, and that could be huge for increasing traffic.
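For example, schema.org’s Recipe type lets you label each part of a recipe page with microdata attributes; the recipe below is made up for illustration:

```html
<div itemscope itemtype="https://schema.org/Recipe">
  <h2 itemprop="name">Classic Banana Bread</h2>
  <p itemprop="description">A simple, moist banana bread.</p>
  <!-- prepTime uses an ISO 8601 duration: PT15M = 15 minutes -->
  <p>Prep time: <meta itemprop="prepTime" content="PT15M">15 minutes.
     Makes <span itemprop="recipeYield">1 loaf</span>.</p>
  <ul>
    <li itemprop="recipeIngredient">3 ripe bananas</li>
    <li itemprop="recipeIngredient">2 cups flour</li>
  </ul>
  <div itemprop="recipeInstructions">Mash the bananas, mix in the flour, and bake…</div>
</div>
```

A crawler that understands schema.org can then extract the name, prep time, servings, and ingredients directly instead of guessing at them from the surrounding text.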
This is obviously just a brief overview of the technical aspect of SEO. If it seems a little overwhelming, it definitely can be. I would humbly encourage you to hire a true SEO provider in your local area.