Article Information

Authors:
Eugene B. Visser1
Melius Weideman2

Affiliation:
1Purple Cow Communications, Cape Town, South Africa

2Website Attributes Research Centre, Cape Peninsula University of Technology, South Africa

Correspondence to:
Melius Weideman

Postal address:
PO Box 3109, Tyger Valley 7536, South Africa

Dates:
Received: 12 May 2013
Accepted: 21 Jan. 2014
Published: 14 Mar. 2014

How to cite this article:
Visser, E.B. & Weideman, M., 2014, ‘Fusing website usability and search engine optimisation’, SA Journal of Information Management 16(1), Art. #577, 9 pages. http://dx.doi.org/10.4102/sajim.v16i1.577

Copyright Notice:
© 2014. The Authors. Licensee: AOSIS OpenJournals.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Fusing website usability and search engine optimisation
In This Original Research...
Abstract
Introduction
   • Background to the study
   • Literature review
      • Search engine optimisation
      • Spamdexing
      • Website usability
Research method and design
Results
Discussion and conclusions
Acknowledgements
   • Competing interests
   • Authors' contributions
References
Abstract

Background: Most websites, especially those with a commercial orientation, need a high ranking on a search engine for one or more keywords or phrases. The search engine optimisation process attempts to achieve this. Furthermore, website users expect easy navigation, interaction and transactional ability. The application of website usability principles attempts to achieve this. Ideally, designers should achieve both goals when they design websites.

Objectives: This research intended to establish a relationship between search engine optimisation and website usability in order to guide the industry. The authors found a discrepancy between the perceived roles of search engines and website usability.

Method: The authors designed three test websites. Each had different combinations of usability, visibility and other attributes. They recorded and analysed the conversions and financial spending on these experimental websites. Finally, they designed a model that fuses search engine optimisation and website usability.

Results: Initially, it seemed that website usability and search engine optimisation complemented each other. However, some contradictions between the two, based on content, keywords and their presentation, emerged. Industry experts do not acknowledge these contradictions, although they agree on the existence of the individual elements. The new model highlights the complementary and contradictory aspects.

Conclusion: The authors found no evidence of previous empirical experimental results that could confirm or refute the role of the model. In the fast-paced world of competition between commercial websites, the model therefore adds value and originality for organisations whose websites play an important role in their business.

Introduction

Background to the study
Small and medium-sized enterprises (SMEs) make up approximately 95% to 98% of all businesses in most countries. They provide many work opportunities and are essential for any country’s economic growth (Samujh 2011). Information technology (IT) is also a main driver of economies. It facilitates the growth of SMEs through expansion into new markets, overcoming obstacles, allowing for quicker responses to changes in consumer patterns and allowing SMEs to compete internationally (Thurasamy et al. 2009). Internet-based technology is a significant part of IT and is an investment that drives innovation (Oliveira & Martins 2010).

In order for SMEs to sustain financial growth, they must adopt marketing systems that facilitate the buyers’ and the sellers’ decision-making processes (Layton 2011). In recent years, many SMEs have adopted e-marketing. This allows dynamic business growth and changes the shape and nature of business by overcoming threats and creating new business opportunities (El-Gohary 2010). Because of e-business, e-marketing, e-commerce and internet user activity, SMEs often use websites to present and market products and/or services because the internet is rapidly becoming a communication, commerce and marketing medium that is changing business globally (Canavan, Henchion & O’Reilly 2007; Küster & Vila 2011; Visser 2007).

After 1993, the release date of Archie Like Indexing for the WEB (ALIWEB), search engine development progressed rapidly, both in the methods developers used to index information on the internet and in the methods they used to provide searchers with the most relevant information available in the shortest time possible, given the search query.

The process of search engine optimisation (SEO) aims to improve the visibility of web pages to search engine crawlers. Today, SEO is an online marketing channel. SEO addresses design, architecture and content to allow for the better ranking of selected keywords. The results searchers receive are the search engine result pages (SERPs). They fall into two sections: organic results (which occupy the primary real estate of the SERPs) and pay per click (PPC) results (which occupy the right-hand side and sometimes the top of the SERPs). Today, three major search engines dominate the market: Google, Bing and Yahoo! Of these, Google is by far the most popular (Carpineto et al. 2009). Currently, these three combined command over 95% of the world market in core searches per month (Adamo 2013).

In order to satisfy searchers’ needs best, search engines must extract the indexed information that is most relevant to the searchers’ queries and present it to them for scrutiny. However, this is a complicated process because of the sheer volume of information available on the internet. Search engines must index this information using search engine crawler programs and rank it appropriately using organic ranking algorithms.

The inner algorithmic workings of search engines are essential because searchers are often more interested in the quality of the search results than their quantity (Yang, Yang & Yuan 2007). Research has shown that, on average, searchers view no more than three SERPs for any particular search query. In fact, the closer any particular web page ranks to the first position on the first SERP, the higher the chances are for searchers to view that particular web page (Weideman 2009).

The research problem of this paper is the reduced productivity that results from a lack of guidance on the synergy between applying SEO and the usability of websites. The goal of this paper was to determine whether a relationship exists between search engine optimisation and website usability and, if so, to define that relationship. The outcome is embodied in a model, which should be useful to industry and academe alike.

Literature review
Soon after search engines became popular, it emerged that people discovered between 42% and 86% of all websites through search engines (Thurow 2003). More recently, it emerged that almost 85% of all e-commerce began with queries submitted to search engines (Murphy & Kielgast 2008). These statistics make e-marketing a crucial component from a business perspective. Engines keep organic ranking algorithms confidential in order to avoid abuse (Jerkovic 2010).

An in-depth look at the literature on search engine optimisation, website usability and spamdexing is necessary at this stage.

Search engine optimisation
SEO is the process of altering websites. It emphasises semantically themed keywords for search engines in order to improve website rankings. This, in turn, improves websites’ chances of being found in SERPs (Weideman 2009).

Google’s head spam engineer has indicated that the Google organic ranking algorithm consists of approximately 200 elements. This suggests that Google considers 200 on- and off-page elements, which determine the relevancy of the search results presented in the SERPs (Cutts 2010). Many industry experts speculate about what these elements are, and about the weight assigned to each element, based on the results of their own experiments. Several authors list a number of elements that would improve organic rankings (Google 2011d; Moz 2013; Sullivan 2011; Sullivan 2012).

The basis of this methodology is understanding search queries from the searchers’ perspectives. Although this may seem almost impossible, search engines can attempt to determine the context of search queries based on past searches. They do this by analysing the estimated time searchers spend on websites before returning to the search engines to submit refined queries, as well as discussions on social networks that provide integrated and associated search engine results (Google 2011c; Rayson 2013). The methodology also builds on the interpretations search engines already hold of the indexed information. Search engines may therefore be able to interpret the current indexed information better by associating search queries with the indexed content using searcher behaviour patterns. By monitoring ever-changing searcher behaviour patterns, search engines could constantly reorganise search results according to relevance (Google 2009; Sullivan 2009).

Google (along with other search engines) allows users (anonymous or identified) to share data with Google. Searchers conduct approximately 400 million searches every day on the Google search engine (Google 2011a; Enge et al. 2010). The vast amount of interpreted data search engines extract anonymously could give them enough information to make an appropriate interpretation of searched queries. Google’s predictive search functionality is only one example of how Google uses searcher behaviour patterns based on popularity (see Figure 1).

FIGURE 1: Google's predictive search functionality.
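
To illustrate the popularity principle behind predictive search, a toy sketch follows. This is an illustration only, not Google's actual system; the query log, counts and the predict function are hypothetical.

# Minimal sketch of popularity-based predictive search (autocomplete).
# The query counts below are hypothetical placeholders.

from collections import Counter

# Hypothetical log of past search queries with their frequencies.
query_log = Counter({
    'copywriting services': 120,
    'copywriting courses': 95,
    'copywriters cape town': 60,
    'copyright law': 40,
})

def predict(prefix, limit=3):
    """Return the most popular logged queries starting with the prefix."""
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: item[1], reverse=True)
    return [q for q, _ in matches[:limit]]

print(predict('copywrit'))  # ['copywriting services', 'copywriting courses', ...]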

Search engines also depend on editorial judgement, or link popularity, to interpret and determine the relevance of the indexed information better. Search engines perceive a link from a credible website as a good quality link. Therefore, they apply a positive ‘vote’ to the website at which the link is directed. Consequently, search engines perceive a website as credible only if the links it obtains come from other credible websites. This suggests that the quality of links obtained is more important than their quantity. Furthermore, the content on the website from which a link originates and the content on the destination website need to be thematically associated in some way. The anchor text (the keywords the actual link uses) is also important and ideally should align with the destination web page’s semantic themes and targeted keywords if search engines are to regard the link as worthy (Thurow 2008).
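
As a rough illustration of how such link ‘votes’ can be computed, a simplified PageRank-style iteration is sketched below. This is not the actual Google algorithm (which also weighs anchor text, theme and many other signals); the link graph is hypothetical.

# Simplified PageRank-style link popularity sketch.
# The link graph is hypothetical; real ranking algorithms use far more signals.

damping = 0.85
graph = {                                   # page -> pages it links to
    'credible.example': ['target.example'],
    'obscure.example': ['target.example', 'other.example'],
    'target.example': [],
    'other.example': ['credible.example'],
}

scores = {page: 1.0 / len(graph) for page in graph}
for _ in range(20):                         # iterate until scores stabilise
    new = {}
    for page in graph:
        # A page inherits weight from every page that links to it,
        # divided by the linking page's number of outgoing links.
        inbound = sum(scores[p] / len(graph[p]) for p in graph if page in graph[p])
        new[page] = (1 - damping) / len(graph) + damping * inbound
    scores = new

print(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))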

Search engines do yield relevant results. However, non-relevant results still appear in SERPs more often than not. In a perfect world, SEO marketers could assist search engines by ensuring that only white hat SEO tactics are applied to websites, thus ‘cleaning’ up the internet and reducing spamdexing. Unfortunately, as long as marketers can earn money through spamdexing, search engines will have to generate new ideas and organic ranking algorithm updates in order to discourage black hat SEO tactics. Eventually, this should reduce spamdexing.

Recent research has examined university institutional repositories and the use of metadata to optimise them for search engine crawler visits. One study focused specifically on Google Scholar™. The researchers found that, if they changed the metadata in these repositories from Dublin Core to Google Scholar’s own prescriptions, indexing rose significantly (Arlitsch & O’Brien 2012).

SEO is an online marketing strategy. The basis of the SEO strategy is to address website design, architecture and content so that search engine spiders can make appropriate interpretations in terms of the themed keywords and rank websites accordingly. Strictly speaking, the goal of SEO is to satisfy search engine spiders and organic ranking algorithms by aligning the targeted themed web page content with the semantically interpreted keywords when considering search queries.

Spamdexing
The basis of the search engine revenue model is PPC advertising. However, the search functionality is available at all times and is free for public use. Search engine success still depends on providing relevant results to searchers in the shortest time possible as non-relevant and/or slow results may deter searchers from using particular search engines. Therefore, non-relevant results could have a direct effect on search engines’ revenue.

Because of the direct association between search engine rankings and searchers’ behaviour when viewing SERPs, it becomes clear that website owners may attempt to manipulate search engine rankings to increase their businesses’ exposure. Spamdexing (also known as search engine spam) refers to websites that attempt to deceive search engines, so that the results provided to searchers are not relevant to their search queries (Weideman 2009). Search engines regard this behaviour as unscrupulous and unsolicited.

As a result, major search engines employ engineers who focus only on eliminating spamdexing. However, search engines do not publish their preventative spamdexing rules. Instead, they publish their (mostly vague) best practice guidelines. They do this to protect their organic ranking algorithms, because these algorithms are part of the foundation of their revenue models (Enge et al. 2010). At the same time, some search engines regularly adapt their algorithms to detect and flag websites that contain attempts at spamdexing. In Google’s case, some of these updates (termed Panda and Penguin) often have dramatic effects on a small percentage of search results (Cooper 2013; Quinton 2012). The industry watches these changes closely. They normally cause a spurt of activity because commercial concerns could experience decreases in revenue if their websites drop in position on the ranking lists.

Spamdexing can occur on two levels: content manipulation and link structure manipulation. Content manipulation is restricted to on-page factors – elements over which only website authors have control. In earlier days, search engines depended primarily on what search engine spiders could see during indexing. This allowed website authors to manipulate web page structure so that some content was visible only to search engine spiders and not to visitors – and vice versa. Manipulation often included targeting keywords and/or repeating them a number of times on a particular web page (keyword stuffing), whereby the web page content would not make contextual sense to visitors. However, search engine spiders would prioritise the keywords, irrespective of the website’s semantic theme, for ranking purposes (Wu & Davison 2006). This is becoming less of a problem today because search engines’ algorithms now depend more on link popularity. Conversely, this opens the opportunity for link structure manipulation. Given the importance of link popularity in organic ranking algorithms, one should note that link popularity manipulation is more difficult to detect than content manipulation.

In the past, black hat SEO marketers sometimes sold apparently undetectable spam methods. However, Google engineers have indicated that one can detect spamdexing (Cutts 2007). This is an interesting statement given the recent findings of spamdexing in the Google SERPs because of link manipulation. The JC Penney fiasco provided more than enough evidence to indicate that one can regard Google’s best practice guidelines for link schemes as nothing more than interesting reading (Google 2011b). An article published in the New York Times revealed that JC Penney (a department store chain) had obtained over 2000 links, mostly from non-related websites. These varied from nuclear engineering and property portals to casino-focused websites, all with the appropriate JC Penney themed anchor text (see Segal 2011). Although the JC Penney website seemed to have participated in link schemes that search engines prohibit, the website still maintained high rankings for a number of targeted terms in Google SERPs. This discovery convinced SEO marketers that link farms and off-topic website linking schemes still have positive ranking effects, despite Google best practice guidelines. The recent Google organic ranking algorithm update ‘Panda’ (a direct result of the JC Penney fiasco) is another attempt to fight spamdexing and enforce the prohibition of link schemes.

Spamdexing is increasing significantly, simply because black hat SEO marketers can make money from it. Unfortunately, it is difficult to measure search quality because search result quality is relative to searchers’ perspectives of their search queries. Ultimately, automated algorithmic solutions may not be the best way to solve the spamdexing problem. The different search engines may actually have to collaborate in order to find an effective solution. However, it is interesting to note that the Google search engine may be responsible, to some extent, for a number of spamdexing websites. Personally owned websites, as opposed to SERPs, may display Google AdSense™ ads (which are part of PPC advertising). Websites that contain AdSense™ ads are, in some circumstances, created with the sole purpose of obtaining visitors who could click on any of the displayed ads. With every click on any of the AdSense™ ads, Google earns money, as does the author of the website that displays them. Therefore, certain search engines might first have to look at their revenue models and their functionality before addressing the spamdexing problem successfully (Cutts 2011).

Website usability
Visitors to a website usually have specific questions about a particular problem or need (Eisenberg et al. 2008). Although one may perceive search engines as facilitators, they must still address searchers’ needs in terms of finding the appropriate website associated with the search queries. Searchers have come to understand that search engines attempt to provide the most relevant results first. This implies that, if they do not obtain the ‘correct’ result within the first three SERPs, the remaining results will probably also be irrelevant. Because search engines do not always comprehend search queries from the searchers’ perspectives fully – often resulting in non-relevant and/or spamdexing results – searchers must frequently alter their search queries to clarify the information they need.

Figure 2 illustrates a typical searcher’s process for finding information.

FIGURE 2: Standard information access process.

Searchers seldom abandon search engines, even when the engines do not meet their information needs. The reason for this is that search engines are the facilitators, and there are not many alternatives (in layout and/or functionality) that can guarantee meeting those information needs. On the other hand, visitors treat websites differently.

One can categorise the intentions and motivations for visiting websites as:

• exploration
• information
• entertainment
• shopping.

Whatever reasons visitors have for visiting websites, developers create websites for information, opinion, marketing and/or financial gain in one way or another. Website usability (WU) addresses the functional application of information and visitors’ ability to interact successfully with that information. Therefore, the goal is to remove any obstacles that could impede visitors’ experiences when interacting with websites (Eisenberg et al. 2008).

Website usability defines the quality of the visitors’ experiences because it addresses mechanical and persuasive usability problems (Visser & Weideman 2011a; Visser & Weideman 2011b). Website usability statistics have shown that there is only a 12% probability that visitors will revisit a particular website. This shows that, once a website has lost a visitor because of a lack of WU, the visitor is usually lost for good (Nielsen & Loranger 2006). Therefore, one should see WU as being as important as (if not more important than) SEO, which, in turn, emphasises the importance of fusing SEO and WU.

Although developers ultimately create websites for visitors, the visitors may initially not be aware of the websites’ existence. Searchers are never fully aware of all the websites on the internet relevant to particular search queries. Therefore, search engines are necessary to provide searchers with the most relevant results using the organic ranking algorithms’ interpretations. Although search engines generate these results dynamically, the results are still subjective. They depend on the organic ranking algorithms, which often provide irrelevant and/or spam results.

In addition, the SERPs may not have listed the ideal website results (or listed them very low down), because those websites were not visible to the search engine spiders (or did not satisfy the organic ranking algorithms appropriately). Search engine spiders and organic ranking algorithms depend on a number of pre-programmed rules that correspond with the conceptual models of the internet. This shows that, if search engine spiders have not crawled and indexed a website, that website cannot possibly rank in the SERPs. Furthermore, if the organic ranking algorithms do not interpret the web page appropriately, then that web page may not rank at all for the targeted keywords.

In summary, website authors must regard the search engine optimisation guidelines (SEO elements) as priorities whilst designing and developing websites if search engines are to crawl, index and rank websites for targeted keywords.

One should not interpret WU as the usefulness of websites from personal angles. Instead, one should regard WU as a task-orientated function from anonymous, yet personal perspectives. It is important for visitors to know, exactly and intuitively, how to accomplish specific tasks on any particular website. Therefore, WU addresses the effectiveness, efficiency, learnability, memorability, error recovery and satisfaction of websites (Thurow & Musica 2009). Although artificial intelligence or algorithms do not govern it, WU has a number of attributes that one could apply to particular websites in order to improve interaction. All websites are different. However, the fundamental attributes remain the same in terms of how visitors interpret web pages. The actual design of websites is a crucial component of WU, and, by using graphic and textual signals, visitors should, at any point, be able to identify the current location, as well as the process involved in order to reach the desired destination whilst considering their objectives.

SEO is an essential component of successful websites, as is WU. Therefore, SEO and WU together form the foundation for successful websites. However, spamdexing is also an essential consideration, because developers must apply SEO in such a way that search engines never perceive business websites as spam. In addition, information-seeking platforms are changing. This will have an effect on WU if websites do not change to accommodate platform-specific devices appropriately (Nicholas et al. 2013). One can view successful SEO as the drawcard that brings visitors to websites (via SERPs), whilst usability is the glue that keeps visitors interacting with websites. When combined correctly, the two will convert searchers to users to buyers – an action often called conversion.

Research method and design

The effect of SEO on websites depends largely on a simultaneous combination and deployment of all SEO elements. On the other hand, WU addresses the functional application of information to visitors.

The authors used a pre-test and post-test quantitative methodological design. They analysed the on-page SEO elements that conflict with WU attributes by using three websites that offer identical services. The three websites were:

The control website (CW): The CW (which a business with minimal knowledge of SEO and WU created) has existed since 2006 (http://www.copywriters.co.za). It consists of 34 pages and 17 114 words of content. Because of the domain age, search engines had already successfully crawled and indexed the website. In addition, the website had generated a number of existing inlinks from several sources on the world wide web. However, for the purpose of this experiment, the authors excluded the referrer and direct traffic sources from the traffic-source data collection. One should note that the authors made no changes to the CW during the experiment.

The experimental website (EW): The authors created the EW with WU in mind. However, they deliberately ignored all SEO elements and kept content to a minimum (http://www.copywriters.co.za/ppc/). They isolated it from all forms of website traffic other than PPC-generated traffic. They obtained user feedback by measuring the number of conversions the website obtained. One should also note that they made no changes to the EW during the experiment.

Experimental website 2 (EW2): The authors created EW2 with SEO in mind. However, they deliberately ignored all WU attributes. They launched the newly formed http://www.translation-copywriters.co.za website on 01 July 2010. It had no existing in-links, 29 web pages and 48 923 words of content. They also excluded referrer and direct traffic from the traffic-source data collection on this website. Organic and PPC traffic were the primary traffic sources. The first search engine crawled and indexed this website on 08 July 2010. During the next four months, the authors made systematic SEO changes to the website and recorded the primary experiment ranking measurements on 08 November 2010. These changes appear below.

» On 08 August 2010, the authors added 8077 words. They also optimised the metadata. This included headings and the anchor text in terms of the keywords.

» On 08 September 2010, the authors added 5362 words. They also optimised the metadata. This included headings and the anchor text in terms of the keywords (similar to the changes the authors made in the previous month).

» On 08 October 2010, the authors added 5726 words. They applied theming and internal linking to emphasise contextual phrasing and semantically related keywords.

» On 08 November 2010, the authors added 3080 words. They increased keyword density, frequency and proximity using the benchmarks they had obtained from high-ranking competitors on the same targeted keywords (a sketch of these measures follows this list).
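
The density, frequency and proximity measures referred to above can be illustrated with a minimal sketch. The text, keyword and formulas below are common textbook definitions, not the exact benchmarks the authors used.

# Minimal sketch of keyword frequency, density and proximity measures.
# Shown on placeholder text; the study's exact formulas are not published.

import re

text = ('Strategic copywriting improves rankings. Good copywriting aligns '
        'keywords with the page theme.')
keyword = 'copywriting'

words = re.findall(r'[a-z]+', text.lower())
positions = [i for i, w in enumerate(words) if w == keyword]

frequency = len(positions)            # raw occurrence count
density = frequency / len(words)      # share of all words on the page
# Proximity: smallest gap (in words) between two occurrences of the keyword.
proximity = (min(b - a for a, b in zip(positions, positions[1:]))
             if frequency > 1 else None)

print(frequency, round(density, 3), proximity)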

The primary experiment focused on monitoring 130 specific keywords and their rankings on the three major search engines (Google, Yahoo! and Bing). The intention was to compare the control website’s rankings with those of EW2 in order to determine whether applying SEO elements to a particular website had a direct effect on improved website rankings for targeted keywords. An extract of this keyword or phrase list follows.

Ad jingles; advertising and editorial writing; advertising copy; industry specific articles copywriting; informative media writing; internet copywriting; jingle writing; strategic business copywriting; strategic copywriting; technical editing; translate English into Afrikaans.

The authors conducted three additional experiments:

Organic traffic: The authors measured organic traffic for the CW and the EW2 daily on each domain. They grouped the results every month and interpreted them using a linear regression analysis. The objective was to determine whether the organic ranking improvements had a direct effect on increases in organic traffic.

Conversion (submitting a contact form): To establish the effectiveness of WU, the authors conducted conversion testing using a PPC campaign that they applied to all three websites for 49 days, with a budget of R3000.00 each. They used the Kruskal-Wallis test to inspect the differences between variables (a sketch of this test follows the list). The specific variables the authors examined were the average time on site per visitor, the average page views per visit and the average number of conversions obtained per visitor. The objective was to determine whether the WU attributes that contradict SEO elements are essential to implement in order to improve website conversions.

Interviews: The authors conducted interviews with five randomly selected website users with a minimum of eight years of internet exposure in order to:
1. Identify WU attributes that they might have overlooked.
2. Consider human interaction compared to a focus on theory only.
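
As an illustration of the Kruskal-Wallis test mentioned above, a minimal sketch using scipy follows. The sample values are hypothetical placeholders, not the study's data.

# Sketch of a Kruskal-Wallis test across the three websites.
# The per-visitor values below are placeholders, not the study's data.

from scipy.stats import kruskal

# Hypothetical 'average time on site' samples (seconds) per website.
cw  = [35, 42, 28, 50, 31]
ew  = [88, 95, 76, 102, 90]
ew2 = [40, 55, 38, 61, 47]

statistic, p_value = kruskal(cw, ew, ew2)
print(f'H = {statistic:.2f}, p = {p_value:.4f}')
# A small p-value indicates a significant difference between the websites
# on this variable; the study repeated this for page views and conversions.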

Results

Through statistical analysis (using univariate analysis of variance), the authors determined whether or not the web page to which a search was directed, and the location from which the search originated (global or local), had a significant effect on ranking positions. Although the results from the primary experiment, recorded on 08 November 2010, indicated that the search engine from which the search originated does not have a significant effect on rankings, the authors determined that, in all instances, the rankings for the EW2 were significantly better than those of the CW. This showed that applying SEO elements to a particular website had a direct effect on improved website rankings for targeted keywords.

The additional experiments the authors conducted revealed:

Organic traffic: Using linear regression analysis, the authors determined that the coefficient for X showed that the traffic trend for the EW2 increased significantly over time. However, the coefficient for X for the control website indicated no significant trend in traffic over time. The authors determined that 51.4% of the variation in traffic to both websites was attributable to changes in time and alterations to the website. This showed that the organic ranking improvements had a direct effect on organic traffic increases.
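
In a simple linear regression of monthly traffic on time, the ‘coefficient for X’ is the slope of the fitted line, and the 51.4% figure corresponds to the coefficient of determination. A standard formulation (an explanatory note, not reproduced from the paper) is:

y_t = \beta_0 + \beta_1 t + \varepsilon_t, \qquad
R^2 = 1 - \frac{\sum_t (y_t - \hat{y}_t)^2}{\sum_t (y_t - \bar{y})^2} = 0.514

where y_t is the traffic observed in month t, \hat{y}_t the fitted value and \bar{y} the mean; a significantly positive slope \beta_1 corresponds to the rising trend reported for the EW2.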

Conversion: The results the authors gleaned from the conversion analysis of all three websites on the three main variables (average time on site per visitor, the average page views per visit and the average number of conversions obtained per visitor) showed that there was a significant statistical difference between the CW, EW and EW2. The EW yielded the highest number of conversions. This shows that applying WU attributes is essential to improving website conversions.

Interviews: The user feedback showed that WU attributes are important factors when visitors are deciding whether or not to interact with a website. The authors determined that participants perceived that some SEO elements were obstacles to WU. This confirmed the existence of SEO and WU contradictions. The participants agreed unanimously that the EW website was the one that provided the best user experience.

On the surface, it seems that SEO and WU complement each other (Visser & Weideman 2011b). However, one should note that SEO and WU contradict each other in some cases. Figure 3 presents the ’fused SEO and WU model’ on the left (marked A). On the right, marked B, Figure 3 presents the key to interpreting the model. The model consists of three main sections: SEO, WU and additional considerations. The connecting red lines illustrate the contradictions between SEO and WU. The bottom of the Figure illustrates the fused SEO and WU solution.

FIGURE 3: The fused search engine optimisation and website usability model (a) and the fused search engine optimisation and website usability model key and colour codes (b). (A scalable image of the model is available at http://www.eugene-visser.co.za)

Essentially, the contradictions between SEO and WU revolve around content, keywords and their presentation. Search engines are not human and, regardless of how advanced artificial intelligence may become, the probability of their completely simulating human behaviour is low. Essentially, search engine crawlers consider two components: the information actual web pages provide and other web pages that give opinions about the information on those web pages. The opinions of others can influence the human decision-making process. However, the human decision-making process is ultimately defined by reflection on previous actions and by intuition (Eisenberg et al. 2008; Pather & Remenyi 2005).

Furthermore, search engines also need to evaluate the competitive component for ranking priorities. This means it is important to understand what websites are about and whether website X will satisfy visitors’ needs better than website Y. In order to achieve this, websites must emphasise the information they provide to search engines. This leads to an enormous amount of content and keyword or phrase semantic emphasis. Conversely, visitors are not interested in an overwhelming amount of content. This could entice visitors to leave websites out of frustration.

Other contradictions revolve around how search engines have attempted to simulate visitors’ interpretations of websites by evaluating their behaviour patterns. Search engines assume that, if a visitor visits a single web page on any particular website and leaves shortly afterwards, the web page did not satisfy the visitor’s needs. These visits are ‘bounces’ and have negative connotations. The contradiction exists with landing pages and conversion optimisation, where developers optimise web pages to provide all the necessary information and functionality to satisfy visitors’ needs best, given their objectives – so a satisfied visitor may leave after a single page and still register as a bounce.
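
A bounce, in the sense used above, is typically computed as a single-page visit. A minimal sketch with hypothetical session records:

# Minimal bounce-rate sketch: a 'bounce' is a session with exactly one
# page view. The session records below are hypothetical.

sessions = [
    {'visitor': 'a', 'pages_viewed': 1},   # bounce
    {'visitor': 'b', 'pages_viewed': 4},
    {'visitor': 'c', 'pages_viewed': 1},   # bounce
    {'visitor': 'd', 'pages_viewed': 2},
]

bounces = sum(1 for s in sessions if s['pages_viewed'] == 1)
bounce_rate = bounces / len(sessions)
print(f'bounce rate: {bounce_rate:.0%}')   # 50%
# Note the contradiction: a well-optimised landing page can satisfy a
# visitor on one page, which this metric still records as a bounce.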

Ironically, industry experts do not acknowledge any contradictions. However, in both SEO (Thurow & Musica 2009) and WU (Nielsen & Loranger 2006:16), experts identify elements and attributes, applied during website optimisation, that do not consider competitors. When one scrutinises the SEO elements and WU attributes that the industry experts define, the model reveals the contradictions (see Figure 3).

Developers could resolve the identified contradictions in the website architecture whilst developing websites. The solution uses SEO methodology, whereby developers categorise websites into themes, thus isolating each category within the website for emphasis (to address the competitive component).

Ideally, each category should consist of a number of content-heavy web pages based on semantically targeted keywords or phrases (in order to address search engine phrase indexing). The internal linking structure within each category needs to emphasise the primary category web page (the landing page) by linking the actual semantically related keywords or phrases in content web pages to the landing page (in order to address semantically related keyword emphasis). Editorial judgement (inbound links) should target the appropriate semantic phrase content pages. This, in turn, will emphasise the landing pages. High authority (good quality) websites relevant to the category should link directly to the landing page. The primary navigation should consist of the landing pages (which should be the actual product or service pages). Ideally, developers should optimise landing pages for conversion. This will provide all the necessary information and functionality to allow visitors to convert on the landing pages.
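
The themed architecture described above can be sketched as a simple structure in which each category's content pages link internally to their landing page. The categories and page paths below are hypothetical illustrations, not taken from the test websites.

# Sketch of the themed internal-linking architecture described above.
# Page names are hypothetical; each content page links to its category's
# landing page so the landing page accumulates internal link emphasis.

site = {
    'copywriting': {                       # category (semantic theme)
        'landing': '/copywriting/',       # primary navigation target
        'content': ['/copywriting/ad-jingles/',
                    '/copywriting/technical-editing/'],
    },
    'translation': {
        'landing': '/translation/',
        'content': ['/translation/english-afrikaans/'],
    },
}

# Every content page carries a keyword-rich internal link to its landing page.
internal_links = [(page, cat['landing'])
                  for cat in site.values() for page in cat['content']]
for source, target in internal_links:
    print(f'{source} -> {target}')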

Applying this architectural methodology (along with all other SEO elements) should address the SEO requirements of websites.

Discussion and conclusions

The SEO methodology prioritised the landing pages during search engine ranking. This indicates that visitors will arrive on the appropriate category web page, which should align with the visitors’ search queries (in order to address the visitors’ particular needs). Here, the primary navigation and breadcrumbs would already have addressed WU in terms of current location and desired destination.

Optimising the landing pages for conversion will reduce unnecessary clicks and visitor frustration. Reducing web page content is a major challenge. However, developers can reduce visible content through expandable content functionality (‘expandables’). Developers should only apply this solution where appropriate and with caution. The functionality entices website interaction, allowing visitors to request additional information on the same web page without impeding interaction and/or the visitor experience.

Although a few search engines support this technology, developers must implement it correctly to ensure that crawlers will crawl and index the content of entire web pages.
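
The underlying principle is that the full content must remain present in the delivered HTML (so crawlers can index it) while only its visibility is toggled client-side. A minimal sketch that generates such markup follows; the <details> element and the structure are illustrative assumptions, not taken from the paper.

# Sketch of crawl-friendly expandable content: the full text is present in
# the HTML source (so crawlers can index it); only its visibility is toggled
# client-side. Markup and structure are illustrative assumptions.

def expandable_section(summary: str, detail: str) -> str:
    # The <details> element keeps the detail text in the DOM while collapsed.
    return (f'<details>\n'
            f'  <summary>{summary}</summary>\n'
            f'  <p>{detail}</p>\n'
            f'</details>')

print(expandable_section(
    'What does strategic copywriting involve?',
    'Full explanatory copy lives here, indexable by crawlers even when '
    'the section is collapsed for visitors.'))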

In conclusion, a listing in the top position on SERPs is not enough. In addition to the web page result description being enticing enough to convince searchers to click on the results (assuming that the search results align with the search queries), it is essential that searchers engage with websites in terms of their objectives (Thurow 2008; Visser & Weideman 2011b).

Finally, unforeseen contradictions may still surface during website development, even when one considers the identified SEO elements, WU attributes and the suggested website architecture.

The authors created the model in Figure 3 to provide a website optimisation guide for any business. It addresses SEO and WU simultaneously. This model is the research’s major contribution to knowledge.

The reality is that each website is unique and its developer must optimise it with critical considerations of its business objectives.

Acknowledgements

Competing interests
The authors declare that they have no financial or personal relationship(s) that may have inappropriately influenced them when they wrote this article.

Authors’ contributions
E.B.V. (Purple Cow Communications) was the doctoral student and was responsible for the experimental and project design and for producing the first draft of the manuscript. E.B.V. also effected ongoing changes and improvements and created and managed the test and result websites. M.W. (Cape Peninsula University of Technology) initiated this research, was the sole supervisor, contributed ideas on an ongoing basis to the design and experimentation, selected the journal, did the proofreading and layout and handled all editorial matters. Both authors made conceptual contributions to this research and participated in the writing and approval of this research project.

References

Adamo, S., 2013, ‘comScore releases August 2013 U.S. Search Engine Rankings’, in comScore, viewed 16 October 2013, from http://www.comscore.com/Insights/Press_Releases/2013/9/comScore_Releases_August_2013_U.S._Search_Engine_Rankings

Arlitsch, K. & O’Brien, P.S., 2012, ‘Invisible institutional repositories: Addressing the low indexing ratios of IRs in Google Scholar’, Library Hi Tech 30(1), 60–81. http://dx.doi.org/10.1108/07378831211213210

Baeza-Yates, R. & Ribeiro-Neto, B., 1999, Modern information retrieval, Addison-Wesley, Harlow.

Canavan, O., Henchion, M. & O’Reilly, S., 2007, ‘The use of the Internet as a marketing channel for Irish speciality food’, International Journal of Retail & Distribution Management 35(2), 178–195. http://dx.doi.org/10.1108/09590550710728110

Carpineto, C., Osiński, S., Romano, G. & Weiss, D., 2009, ‘A survey of web clustering engines’, ACM Computing Surveys 41(3), 17.1–17.38.

Cooper, L., 2013, ‘Aiming high’, Marketing Week 41(3), 39–41.

Cutts, M., 2007, ‘Undetectable spam’, in Matt Cutts: Gadgets, Google, and SEO, viewed 01 May 2013, from http://www.mattcutts.com/blog/undetectable-spam/

Cutts, M., 2010, ‘Google incorporating site speed in search rankings’, in Matt Cutts: Gadgets, Google, and SEO, viewed 01 February 2013, from http://www.mattcutts.com/blog/site-speed/

Cutts, M., 2011, ‘My thoughts on this week’s debate’, in Matt Cutts: Gadgets, Google, and SEO, viewed 26 January 2013, from http://www.mattcutts.com/blog/google-bing/

Eisenberg, B., Quarto-von Tivadar, J., Davis, L.T. & Crosby, B., 2008, Always be testing: the complete guide to Google website optimizer, Sybex, Indianapolis.

El-Gohary, H., 2010, ‘E-Marketing – A literature review from a small business perspective’, International Journal of Business and Social Science 1(1), 214–244.

Enge, E., Spencer, S., Fishkin, R. & Stricchiola, J.C., 2010, The art of SEO – Mastering search engine optimization, O’Reilly Media Inc., Sebastopol.

Google, 2009, Personalized search for everyone, viewed 25 August 2013, from http://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.html

Google, 2011a, Frequently asked questions for the Google analytics data sharing options, viewed 29 November 2012, from http://www.google.com/support/analytics/bin/answer.py?answer=87515

Google, 2011b, Link schemes, viewed 27 January 2013, from http://www.google.com/support/webmasters/bin/answer.py?answer=66356

Google, 2011c, High-quality sites algorithm goes global, incorporates user feedback, viewed 25 August 2013, from http://googlewebmastercentral.blogspot.com/2011/04/high-quality-sites-algorithm-goes.html

Google, 2011d, More guidance on building high-quality sites, viewed 25 August 2013, from http://googlewebmastercentral.blogspot.com/2011/05/more-guidance-on-building-high-quality.html

Jerkovic, J.I., 2010, SEO warrior, O’Reilly Media Inc., Sebastopol.

Kűster, I. & Vila, N., 2011, ‘Successful SME web design through consumer focus groups’, International Journal of Quality & Reliability Management 28(2), 132–154. http://dx.doi.org/10.1108/02656711111101728

Layton, R.A., 2011, ‘Towards a theory of marketing systems’, European Journal of Marketing 45(1/2), 259–276. http://dx.doi.org/10.1108/03090561111095694

Moz, 2013, Google algorithm change history, viewed 25 August 2013, from http://moz.com/google-algorithm-change

Murphy, H.C. & Kielgast, C.D., 2008, ‘Do small and medium-sized hotels exploit search engine marketing?’, International Journal of Contemporary Hospitality Management 20(1), 90–97. http://dx.doi.org/10.1108/09596110810848604

Nielsen, J. & Loranger, H., 2006, Prioritizing web usability, New Riders Press, Berkeley.

Nicholas, D., Clark, D., Rowlands, I. & Jamali, H.R., 2013, ‘Information on the Go: A case study of Europeana mobile users’, Journal of the American Society for Information Science and Technology 64(7), 1311–1322. http://dx.doi.org/10.1002/asi.22838

Oliveira, T. & Martins, M.F., 2010, ‘Understanding e-business adoption across industries in European countries’, Industrial Management & Data Systems 110(9), 1337–1354. http://dx.doi.org/10.1108/02635571011087428

Pather, S. & Remenyi, D., 2005, ‘Some of the philosophical issues underpinning research in information systems – From positivism to critical realism’, South African Computer Journal 35, 76–83.

Quinton, B.D., 2012, ‘Google’s Penguin update: Unhappy feat for marketers?’, Chief Marketer 3(10), 8.

Rayson, S., 2013, ‘10 Ways Google+ will improve your SEO’, in socialmediatoday, viewed 25 August 2013, from http://socialmediatoday.com/node/1600736

Samujh, H., 2011, ‘Micro-businesses need support: survival precedes sustainability’, Corporate Governance 11(1), 15–28. http://dx.doi.org/10.1108/14720701111108817

Segal, D., 2011, ‘The dirty little secrets of search’, in The New York Times, viewed 02 February 2013, from http://www.nytimes.com/2011/02/13/business/13search.html

Sullivan, D., 2009, ‘Google now personalizes everyone’s search results’, in Search Engine Land, viewed 25 August 2013, from http://searchengineland.com/google-now-personalizes-everyones-search-results-31195

Sullivan, D., 2011, ‘Why Google Panda is more a ranking factor than algorithm update’, in Search Engine Land, viewed 25 August 2013, from http://searchengineland.com/why-google-panda-is-more-a-ranking-factor-than-algorithm-update-82564

Sullivan, D., 2012, ‘Google Penguin update recovery tips & advice’, in Search Engine Land, viewed 25 August 2013, from http://searchengineland.com/penguin-update-recovery-tips-advice-119650

Thurasamy, R., Mohamad, O., Omar, A. & Marimuthu, M., 2009, ‘Technology adoption among small and medium enterprises (SME’s): A research agenda’, World Academy of Science, Engineering and Technology 53, 943–946.

Thurow, S., 2003, Search engine visibility, New Riders Press, Indianapolis.

Thurow, S., 2008, Search engine visibility, 2nd edn., New Riders Press, Indianapolis.

Thurow, S. & Musica, N., 2009, When search meets web usability, New Riders Press, Berkeley.

Visser, E.B., 2007, ‘Search engine optimisation elements’ effect on website visibility: The Western Cape real estate SMME sector’, unpublished MTech thesis, Cape Peninsula University of Technology.

Visser, E.B. & Weideman, M., 2011a, ‘An empirical study on website usability elements and how they affect search engine optimisation’, SA Journal of Information Management 13(1), Art. #428, 9 pages. http://dx.doi.org/10.4102/sajim.v13i1.428

Visser, E.B. & Weideman, M., 2011b, ‘Search engine optimisation versus website usability – Conflicting requirements?’, Information Research 16(3), paper 493.

Weideman, M., 2009, Website visibility: The theory and practice of improving ranking, Chandos Publishers, Oxford. http://dx.doi.org/10.1533/9781780631790

Wu, B. & Davison, B.D., 2006, ‘Detecting semantic cloaking on the Web’, in Proceedings of the International World Wide Web Conference Committee (IW3C2), Edinburgh, Scotland, 23–26 May.

Yang, C., Yang, K. & Yuan, H., 2007, ‘Improving the search process through ontology-based adaptive semantic search’, The Electronic Library 25(2), 234–248. http://dx.doi.org/10.1108/02640470710741359