Abstract
Orientation: The majority of local governments in South Africa are underperforming; a first step towards improving their performance is to diagnose their current functioning accurately. A mixed-methods approach to this diagnosis, based on a valid model of organisational performance, will provide a better and more holistic understanding of how a local government is performing.
Research purpose: The aim of this study is to investigate the utility of mixed-methods research as a diagnostic approach for determining the organisational performance of a local government in South Africa.
Motivation for the study: The use of either quantitative or qualitative data gathering in isolation as part of an organisational diagnosis can lead to biased information and a failure to identify the root causes of problems. The use of mixed-methods research, in which both quantitative and qualitative data gathering methods are utilised, has been shown to produce numerous benefits, such as confirmation of gathered data, providing richer detail and initiating new lines of thinking. Such multiple methodologies are recognised as an essential component of any organisational diagnosis and can be an effective means of eliminating the biases of singular data gathering methods.
Research design, approach and method: A concurrent transformative mixed-methods strategy based on the Burke–Litwin model of organisational performance with triangulation of results and findings to determine convergence validity was used. A convenience sample of 116 (N = 203) permanent officials in a rural district municipality in South Africa completed a survey questionnaire and were also individually interviewed.
Main findings: Results indicate that mixed-methods research is a valid technique for establishing the integrity of survey data and for providing a better and holistic understanding of the functioning of an organisation. The results also indicate that the Burke–Litwin model is a useful and valid diagnostic framework for identifying the strengths and development areas of an organisation’s performance. Finally, the results established the reliability and validity of the survey instrument used for gathering data.
Practical and managerial implications: A mixed-methods research approach is a useful method to diagnose organisations’ performance to ensure data integrity and to obtain a comprehensive picture of an organisation’s performance. A further practical implication is that managers and practitioners can use the Burke–Litwin model as a basis for diagnosing the performance of an organisation with confidence, as it identifies the most important aspects of an organisation’s functioning.
Contribution and value add: Organisational diagnoses are usually conducted by either quantitative or qualitative means, while the use of mixed-methods research is a relatively underutilised approach. This study aims to contribute to the availability of research approaches for diagnosing the performance of organisations.
Introduction
For the public sector, improving organisational effectiveness has become just as important as for the private sector (Boyne, James, John & Petrovsky, 2009). Citizens all over the world are increasingly demanding that public sector organisations improve their service delivery and prove that they have an impact on complex social problems, while tax payers are demanding an acceptable return on the taxes that they pay to governments at all levels (Sowa, Seldon & Sandfort, 2004). Since the 1980s, public sectors around the world have emphasised administrative reform to improve effectiveness and efficiency (Van Thiel & Leeuw, 2002). This is also true at local government level, as municipalities are creatures of statute: they exist to do the things that parliament or a country's constitution has said that they should do (Jackson, 1984). As such, they are under pressure to perform, as voters will either punish or reward them accordingly (Boyne et al., 2009).
In South Africa the large number of service delivery protests at local government level over the last 8 years is a clear indication that ordinary citizens are demanding that municipalities become more effective and start delivering on their constitutional mandate (Municipal IQ Hotspots Monitor, 2016). Demands for interventions to improve the functioning of municipalities are also increasing, and the South African Government had to launch an ambitious Local Government Turn-Around Strategy (LGTAS) in an attempt to improve local government effectiveness (Department of Cooperative Governance and Traditional Affairs [DCOGTA] 2009).
Hall (1999) states that the first step towards improving organisational effectiveness is to determine how the organisation is currently functioning, and to do this, an organisational diagnosis is necessary. This is supported by Saeed and Wang (2013), who state that managers should be capable of speedily identifying the weaknesses of their organisations to stay competitive. Harrison (2005) states that in organisational diagnosis, conceptual models and applied research methods are used to assess an organisation's current state and determine how to enhance performance, as without careful diagnosis, decision makers may waste effort by failing to attack the root causes of problems. Diagnosing organisations also forms part of the action research approach to organisational development (OD), which, according to Cummings and Worley (2015), is a process for understanding, developing and changing organisations and improving their health, effectiveness and self-renewal capabilities. It is thus clear that the assessment of organisational effectiveness is essential to improve it.
The use of mixed-methods research, in which quantitative and qualitative data gathering methods are utilised, has been shown to produce numerous benefits, such as confirmation of gathered data, providing richer detail and initiating new lines of thinking (Rossman & Wilson, 1991). Such multiple methodologies are recognised as an essential component of any organisational diagnosis (Paul, 1996) and can be an effective means of eliminating biases in quantitative and qualitative data gathering methods (Creswell, 2014). Organisational models, in turn, are useful for enhancing our understanding of organisational behaviour, helping us to categorise and interpret data about an organisation and providing a common, short-hand language (Howard, 1994). The Burke–Litwin model of organisational performance has been shown to provide a valid framework for describing the relationships between different aspects of an organisation (Burke & Litwin, 1992; Cummings & Worley, 2015) and has been used with success to measure the performance of various diverse organisations in the private sector (Martins & Coetzee, 2009).
The action research approach to OD, with an organisational diagnosis as an important step in this process, is well established (Cummings & Worley, 2015; Falletta, 2005; French, Bell & Zawacki, 1978; Jones & Brazzel, 2006; Martins & Geldenhuys, 2016; Saeed & Wang, 2013). Similarly, authors have reported that the Burke–Litwin model of organisational performance is a valid framework for diagnosing an organisation (Burke & Litwin, 1992; Cummings & Worley, 2015; Martins & Coetzee, 2009). Numerous studies have also utilised a quantitative (Harrison, 2005; Smit, 1999; Wiley, 2010) or qualitative (Martins & Coetzee, 2009; Thomas, 2012; Tsuchiya, 2016) approach to diagnose the current functioning of an organisation with excellent results. However, the use of mixed-methods research as a diagnostic approach based on the Burke–Litwin model is largely non-existent. This study aims to contribute to the theoretical and empirical body of knowledge on utilising a mixed-methods approach based on the Burke–Litwin model when conducting an organisational diagnosis and will provide both researchers and practitioners with alternative diagnostic strategies to utilise as part of the action research approach to OD.
Research objective
The aim of the current study is to investigate the utility of mixed-methods research as a diagnostic approach for determining the organisational performance of a local government in South Africa.
What will follow
The article starts with a review of relevant literature regarding local government in South Africa, mixed-methods research and the use of the Burke–Litwin model in assessing the performance of organisations. The literature review is followed by the research methodology after which the results and findings of the study are discussed. The article then ends with limitations, recommendations for future research, implications for management and conclusions.
Literature review
Importance of local government in South Africa
According to Mitlin (2000), local government is a very important sphere of government all over the world because it is the sphere of government that is the closest to the people, as many basic services are delivered by municipalities, and local ward councillors are the politicians closest to communities. Furthermore, local governments are increasingly required to play larger roles in providing services, alleviating poverty and facilitating development (Andrews & Shah, 2003; Cameron, 2005; Craythorne, 2006; Mortimer, 2004; Rakodi, 1997).
In South Africa, after the first democratic municipal elections in 1994, there were high expectations of local governments regarding service delivery (Mortimer, 2004). This is confirmed by the fact that the objectives for local government are set out in section 152 of the South African Constitution (Constitution of the Republic of South Africa, 1996). In accordance with the Local Government: Municipal Structures Act (1998) and to fulfil its constitutional obligations, South Africa is divided into 283 municipalities, based on three legal categories, namely Metropolitan Municipalities (8), District Municipalities (44) and Local Municipalities (231). However, a central challenge for the many new institutions of local government in South Africa has been their viability and ability to build strong organisations capable of delivering on the principles of the Constitution (Atkinson, 2007).
The effectiveness of local government in South Africa
The primary responsibility of local governments in South Africa today is to provide access to crucial public services (Constitution of the Republic of South Africa, 1996). Given their constitutional responsibilities, how are local governments in South Africa currently performing? According to the DCOGTA (2009), which conducted its own investigation into the functioning of all municipalities in South Africa, the local sphere of government is fraught with community frustration over poor institutionalisation of systems, poor service delivery and poor political governance. The State of Local Government in South Africa Report (DCOGTA, 2009) further states that a culture of patronage and nepotism is now so widespread in many municipalities that the formal municipal accountability system is ineffective and inaccessible to many citizens. This is supported by Koma (2010, p. 122), who states that ‘The performance of numerous municipalities across the country has thus far clearly demonstrated huge deficiencies in as far the fulfilment of both their constitutional and legislative obligations are concerned’. A report by the Department of Planning, Monitoring and Evaluation (2014) states that local governments face challenges of poor governance, lack of accountability, a shortage of technical and professional skills, the inability to collect revenue from citizens and weak service delivery. These challenges were confirmed by a report by the South African Institute of Race Relations (2014) in which 80 local government indicators derived from census data were used to assess the performance of all municipalities in South Africa. Finally, in the DCOGTA Annual Report for 2014/2015 (DCOGTA, 2016, p. 27), it is stated that ‘In 2014, a third of municipalities are performing their functions adequately, a third is at risk as they are relatively functional, and the bottom third are failing people dramatically and require urgent intervention to correct the decay in the system’.
The inability of local governments to deliver services as described above has been publicly evidenced in the spate of community protests in South Africa since 2009, which may be seen as a symptom of the alienation of citizens from local government. According to Municipal IQ Hotspots Monitor (2016), there have been 983 major service delivery protests staged against local governments over the period 2009–2015, occurring roughly at a rate of a protest every second day, which is illustrated in Figure 1. These protests indicate an escalating loss of confidence in local government service delivery in South Africa over the last 7 years covered by the Municipal IQ Hotspots Monitor report.
FIGURE 1: Major service delivery protests in South Africa over the period 2009–2015.
According to Atkinson (2007), Schmidt (2010), Mbecke (2014) and Mutiro and Fore (2015), the main reasons for the mass protests over the last few years are (1) municipal ineffectiveness in service delivery because of a lack of leadership, corporate governance and widespread corruption; (2) the poor responsiveness of municipalities to citizens’ grievances; and (3) the conspicuous consumption entailed by a culture of self-enrichment on the part of municipal councillors and staff. Adding to this, Leibbrandt and Botha (2014) state that the inability to execute strategies is one of the main problems in local governments in South Africa today. It is thus obvious that the majority of local governments in South Africa are currently underperforming, deemed to be ineffective and in crisis.
Importance of assessing the organisational performance of local government
To improve the effectiveness of municipalities in South Africa, and thus the lives of all its citizens, it will firstly be necessary to measure current performance. This will ensure that the correct aspects are addressed to improve effectiveness. In fact, this diagnostic approach to improving organisational effectiveness is applicable to any kind of organisation (Harrison, 2005).
Falletta (2005, p. 3) states that 'organisational diagnosis involves diagnosing or assessing an organisation's current level of functioning in order to design appropriate change interventions'. Lee and Brower (2006) support this view and state that the practical use of assessing organisational effectiveness stems from the intent to analyse the present state of an organisation and to improve its performance in accordance with diagnostic findings. According to Immordino (2010), to remain effective, organisations of all kinds must continually improve themselves in response to the challenges that they face; those in the public sector are no exception. This is because all levels of government are under constant pressure to improve their efficiency, effectiveness and responsiveness (Boyne et al., 2009). It is thus clear that a diagnosis of a municipality is an important first step towards its effectiveness.
Mixed-methods research
According to Johnson, Onwuegbuzie and Turner (2007, p. 112), ‘Mixed methods research is becoming increasingly articulated, attached to research practice, and recognized as the third major research approach or research paradigm, along with qualitative research and quantitative research’. Despite this, Leech and Onwuegbuzie (2009, p. 265) state that ‘The mixed method paradigm is still in its adolescence, and, thus, is still relatively unknown and confusing to many researchers’. In fact, Johnson et al. (2007) identified and discussed 19 different definitions of mixed-methods research from the literature.
According to Johnson et al. (2007), since the 1960s various other terms have been used to refer to mixed-methods research, including multiple operationalism, triangulation, blended research, integrative research, multiple methods and mixed research. Although triangulation became a popular way of referring to mixed-methods research in the literature, it would seem that currently, triangulation is only one component of such a design, explaining the mixing of data once it has been gathered (Torrance, 2012). This view is supported by recent authors who all tend to use the term mixed-methods research (Bryman, 2007; Creswell, 2009, 2014; Johnson et al., 2007; Tashakkori & Teddlie, 2010). Creswell (2014) provides a recent explanation of mixed-methods research, which contains several core characteristics, including that it involves the collection of both qualitative and quantitative data; that it includes the analysis of both forms of data; that the two forms of data are integrated in the design analysis through merging the data, connecting the data, or embedding the data; and that these procedures are incorporated into a mixed-methods design that includes the timing of data collection as well as the emphasis for each database.
Various authors have set out the advantages of a mixed-methods approach, including participant enrichment, instrument integrity, treatment integrity, significance enhancement (Johnson et al., 2007), drawing on both qualitative and quantitative research and minimising the limitations of both approaches, and providing a more complete understanding of research questions (Creswell, 2014). These advantages have prompted researchers to use mixed-methods research in various diverse settings over the last 20 years, including to diagnose a pharmacy department in a non-profit research hospital (Paul, 1996), to assess organisational culture (Yauch & Steudel, 2003), to study relocation decision making and couple relationships (Challiol & Mignonac, 2005), to study adolescent alcohol use (Newman, Shell, Ming, Jianping & Maas, 2006), to study social support in widowhood (Scott et al., 2007), to evaluate a school principal development programme (Youngs & Piggot-Irvin, 2012) and to study antecedents of servant leadership (Beck, 2014). However, there is a lack of studies utilising a mixed-methods research approach to diagnose the organisational performance of organisations in general and local governments in particular.
Models of organisational performance
According to Lusthaus, Adrien, Anderson, Carden and Montalván (2002), an organisational model is a framework for analysing the strengths and weaknesses of an organisation in relation to its performance. This view is supported by Immordino (2010), who states that using a model is a structured method of collecting and evaluating information about those areas of an organisation’s operations that are most closely associated with organisational excellence.
Falletta (2005) argues that an organisational model is a representation of an organisation that helps us to understand more clearly and quickly what we are observing in organisations and that it provides a systematic way to collect data on the organisation and to understand and categorise the data. Cummings and Worley (2015) support this view, explaining that organisational models are conceptual frameworks that practitioners use to understand organisations. Both Falletta (2005) and Wiley (2010) state that models often identify vital organisational variables, which are hypothesised to exist based on prior research and depict the nature of the relationships between these key variables. Harrison (2005) argues that models are important in organisational assessment, as they help to choose which data to attend to and which data to ignore, to determine what kinds of analysis should be applied to the data and to interpret the meaning of the output of those analyses.
The Burke–Litwin model of organisational performance
To assess the performance of a municipality in South Africa, it was essential to choose an organisational model that complied with certain diagnostic requirements. Martins and Coetzee (2009) mention that such a model must be well researched, while Olivier (2014) states that it should address as many of the elements of an organisation’s functioning as possible. Although Jones and Brazzel (2006) state that there is no one best diagnostic model to use, the Burke–Litwin model (Burke & Litwin, 1992) was considered by the researcher to comply with all the stated requirements.
The Burke–Litwin model of organisational performance is founded on a functional cause-and-effect framework and explains how linkages between 11 elements contribute towards organisational performance, which is the 12th element (Burke & Litwin, 1992). Burke and Litwin (1992) describe organisational performance as the outcome of work performance, effort and achievement, and indicators of this would include productivity, customer satisfaction and service quality. According to Jones and Brazzel (2006) and Martins and Coetzee (2009), the Burke–Litwin model highlights two distinct sets of organisational dynamics. One set is primarily associated with the transactional level of human behaviour or the everyday interactions and exchanges that create the climate of the organisation, while the second set of dynamics is concerned with processes of human transformation, amounting to sudden ‘leaps’ in behaviour. See Figure 2 for a visual exposition of the model.
FIGURE 2: The causal model of organisational performance.
Transformational factors affecting organisational performance
Burke and Litwin (1992) state that transformational variables refer to areas in which alteration is caused by interaction with environmental forces (both within and without) and which require entirely new sets of behaviour on the part of organisational members. According to Burke and Litwin (1992), the following four transformational factors affect organisational performance:
External environment: Any outside condition or situation that influences the performance of the organisation. These conditions include marketplaces, world financial conditions and political and governmental circumstances.
Mission and strategy: What employees believe is the central purpose of the organisation and how the organisation intends to achieve that purpose over time.
Leadership: Executive behaviour that encourages others to take necessary actions.
Culture: The collection of overt and covert rules, values and principles that guide organisational behaviour and that have been strongly influenced by history, custom and practice.
Transactional factors affecting organisational performance
According to Burke and Litwin (1992), the following seven transactional variables, referring to interactions that take place primarily via relatively short-term reciprocity among people and groups, affect organisational performance:
Structure: The arrangement of functions and people into specific areas and levels of responsibility, decision making authority and relationships.
Management practices: What managers do in the normal course of events to use human and material resources to carry out the organisation’s strategy.
Systems: Standardised policies and mechanisms that facilitate work. Systems primarily manifest themselves in the organisation’s reward systems and in control systems such as goal and budget development and human resource allocation.
Work group climate: The collective current impressions, expectations and feelings of the members of local work units.
Skills or job match: The behaviour required for task effectiveness, including specific skills and knowledge required for people to accomplish the work assigned and for which they feel directly responsible.
Individual needs and values: The specific psychological factors that provide desire and worth for individual actions or thoughts.
Motivation: Aroused behavioural tendencies to move towards goals, to take necessary action and to persist until satisfaction has been attained.
The Burke–Litwin model of organisational performance has been shown to provide a valid framework for describing the relationships between different aspects of an organisation (Burke & Litwin, 1992; Cummings & Worley, 2015) and has been used with success to measure the performance of various diverse organisations (Martins & Coetzee, 2009; Olivier, 2014).
From the literature review, the following mixed-methods research question was formulated:
Do the quantitative results and qualitative findings of an organisational diagnosis based on the Burke–Litwin model of organisational performance show convergent validity as to how a local government is performing?
Research design
Research approach and strategy
A concurrent transformative mixed-methods strategy using triangulation to determine convergence validity was used (Creswell, 2009; McFee, 1992). This strategy entails the concurrent collection of both quantitative and qualitative data (Creswell, 2014), guided by the researcher's use of a specific theoretical perspective (Creswell, 2009), the separate analysis of the quantitative and qualitative data gathered (Braun & Clarke, 2013; Creswell, 2014) and triangulation of data to determine the convergence validity of the data gathered (Creswell, 2009; McFee, 1992). This strategy is depicted in Figure 3. Quantitative data were gathered by means of a survey instrument while qualitative data were gathered by means of interviews, with the Burke–Litwin model of organisational performance as the theoretical perspective on which the gathering of both types of data was based (Burke & Litwin, 1992).
FIGURE 3: A visual exposition of the concurrent transformative strategy with triangulation.
This strategy was used for three reasons. Firstly, it allowed a better and more holistic understanding of how the local government in the study was performing by using both quantitative and qualitative performance data, which Collins, Onwuegbuzie and Sutton (2006, p. 116) refer to as 'significance enhancement'. Secondly, it enabled the content validity of the questionnaire to be established through triangulation, comparing the interview data with the questionnaire data, which Collins et al. (2006, p. 116) refer to as 'instrument fidelity'; this was necessary because the content validity of the survey instrument used had not yet been established for a local government. Thirdly, it allowed a validated organisational performance model, namely that of Burke and Litwin (1992), which addresses all the elements of an organisation's functioning (Martins & Coetzee, 2009; Olivier, 2014), to guide both the quantitative and qualitative data gathering.
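The triangulation step in this strategy can be sketched in code. The following is a hypothetical illustration only, not the study's actual analysis: element names and ratings are invented, and the rule of comparing each element's survey-based classification with the majority interview rating is an assumed operationalisation of convergence.

```python
# Hypothetical triangulation of quantitative and qualitative classifications
# per Burke-Litwin element (invented data for illustration).

quantitative = {  # classification derived from survey mean scores
    "Leadership": "Development area",
    "Culture": "Acceptable",
    "Systems": "Development area",
}
qualitative = {  # majority rating from the individual interviews
    "Leadership": "Development area",
    "Culture": "Acceptable",
    "Systems": "Acceptable",
}

# Elements on which the two data sources converge.
converging = [e for e in quantitative if quantitative[e] == qualitative[e]]
rate = len(converging) / len(quantitative)
print(f"Convergence on {len(converging)} of {len(quantitative)} elements ({rate:.0%})")
```

In the full study, the same comparison would span all 12 elements of the model, with divergent elements flagged for closer inspection of the interview reasons.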
The research followed an organisational diagnostic approach. Organisational diagnosis is seen as central to the practice of OD, being a data gathering method of determining a valid and accurate picture of an organisation’s current functioning (Brown, 2011; Cummings & Worley, 2015; French et al., 1978). According to Cummings and Worley (2015), diagnosis is the process of understanding how an organisation is currently functioning and providing the information necessary to design improvement interventions. For this reason, various authors have over the years emphasised the importance of assessing an organisation to understand it and improve its functioning (Brown, 2011; Cummings & Worley, 2015; Ghorpade, 1971; Hall, 1999; Harrison, 2005; Lusthaus et al., 2002).
Research method
Research setting
As stated previously, South Africa is divided into 283 municipalities, based on three legal categories, namely 8 densely populated metropolitan municipalities, 44 district municipalities and 231 local municipalities (Local Government: Municipal Structures Act, 1998). This research was conducted in one of the 44 district municipalities, located in a small rural or lightly settled town, which has five local municipalities within its area of jurisdiction. Its main function is to coordinate the delivery of municipal services to inhabitants across the five local municipalities. The district municipality consisted of 203 permanent officials who fulfilled the various line and support functions in the municipality.
Entrée and establishing researcher roles
The primary researcher was introduced to the management team of the municipality in this study by their internal OD consultant, who attended an OD programme presented by the researcher as a lecturer at the University of South Africa. The primary researcher was subsequently contracted by the district municipality (client) to conduct an organisational diagnosis to determine the current functioning of the municipality and to compile a suggested intervention plan to rectify any identified development areas. Three experienced OD consultants who were all thoroughly trained on the Burke–Litwin model and on interviewing techniques also formed part of the research team. OD consultants rather than once-off interviewers were used to conduct the interviews as they would also be involved in providing feedback to the municipality and in implementing interventions to address the identified developmental areas. The primary researcher subsequently contracted the terms of reference for the diagnosis with the management team, while the municipality’s internal OD consultant was used extensively to gain access to the management team and personnel within the municipality and to set up the necessary appointments and assist with logistical arrangements.
Research participants and sampling
The population for this study was all the permanent officials of a rural district municipality in South Africa, namely 203. The sampling method utilised was convenience sampling, a non-probability sampling method (Babbie, 2010), which, according to Onwuegbuzie and Collins (2007), is one of 24 sampling methods available and suitable for mixed-methods research. This method was used as all permanent officials of the district municipality were invited to voluntarily take part in the data gathering, which was explained as consisting of a self-completion questionnaire and one-on-one individual interviews. A sample of 116 (57%) officials was willing to participate in the study, which is nearly double the 30% that Salkind (2012) considers to be a good representation of the population for a quantitative survey of this nature. The same sample was used for the qualitative data gathering phase, which Onwuegbuzie and Collins (2007) refer to as an identical sample relationship and which they state is appropriate for a concurrent mixed-methods research design. The aim of such a design, as used in this study, is to compare both forms of data to search for congruent findings, that is, how the statistical results in the quantitative analysis compare with the themes identified in the qualitative data collection (Creswell, 2009). The composition of the sample is indicated in Table 1.
TABLE 1: Composition of the research sample (n = 116).
As can be seen from Table 1, the majority of the sample were men (66.4%), had a tertiary education (68.2%) and had less than 5 years of experience at the municipality (78.4%), while 45.7% were supervisors or managers.
Measuring instrument: Quantitative data collection
Quantitative data were gathered by means of the Organisational Performance Questionnaire (OPQ), a survey instrument developed specifically for the district municipality. Firstly, a study was made of the questions used in the 2002 Langley Research Center (LaRC) Organizational Performance Survey. This survey, which consisted of 134 questions, was compiled to measure the 12 elements of the Burke–Litwin model (Burke & Litwin, 1992) and was constructed between 2000 and 2002 in a joint effort between LaRC and IBM (at the time PriceWaterhouseCoopers LLC) (IBM Business Consulting Services, 2003). Secondly, the questions used in the LaRC survey were adapted to local government terminology and refined based on the researcher's knowledge of local government and the Burke–Litwin model. This resulted in a draft survey instrument, which the researcher named the OPQ, consisting of 96 items. Thirdly, a pilot study was conducted with the draft version of the OPQ to test its face validity, according to the guidelines provided by Collingridge (2014). This was done by requesting a group of 15 managers, representing all the different departments in the district municipality, to complete the OPQ and to comment on four specific aspects: whether they understood the instructions for completing the questionnaire, whether the correct biographical information was requested from respondents, whether the terminology used was appropriate for the district municipality and whether they understood the questions posed. The managers in the pilot study were satisfied with the draft survey instrument and suggested no changes. This version was then accepted as the final OPQ, consisting of a biographical section and 96 questions assessing each of the 12 Burke–Litwin elements.
Respondents who completed the OPQ were required to rate each of the survey questions on a four-point Likert scale (Likert, 1932), where 1 was Strongly Disagree, 2 was Disagree, 3 was Agree and 4 was Strongly Agree. Undecided was allocated a score of nought and thus did not influence calculations. Each of the 12 elements was measured separately, producing a separate score reflecting the respondents’ evaluation of that element.
Before the data gathering process commenced, it was agreed with the management team of the district municipality how the OPQ results would be interpreted. The discussions were guided by the cut-off scores recommended by the Human Sciences Research Council of South Africa, as reported by Castro and Martins (2010), in which a score of 3.2 (64%) on a five-point Likert scale was recommended as the threshold of acceptable performance. On the four-point scale used in this study, 64% converts to a score of 2.56, but it was decided to raise the cut-off for acceptable performance to 3 out of 4, or 75%. This resulted in the following score ranges for this study:
Development area: A total average score for a Burke–Litwin element of 1.0–2.9 (below 75%).
Acceptable: Maintain: A total average score for a Burke–Litwin element of 3.0–4.0 (75–100%).
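The scoring and interpretation rules above can be sketched as follows. The function names and the sample ratings are illustrative, not taken from the study; only the treatment of Undecided (scored 0 and excluded), the HSRC conversion and the 3.0 cut-off come from the text.

```python
def element_score(ratings):
    """Average the 1-4 Likert ratings for one Burke-Litwin element.

    A rating of 0 (Undecided) is excluded so that it does not
    influence the calculation, as described above.
    """
    valid = [r for r in ratings if r > 0]
    return sum(valid) / len(valid)

def interpret(score):
    """Apply the contracted cut-off of 3.0 out of 4 (75%)."""
    return "Acceptable: Maintain" if score >= 3.0 else "Development area"

# The HSRC guideline of 3.2 on a five-point scale (64%) converts
# to 2.56 on the four-point scale used in this study:
hsrc_cutoff_four_point = (3.2 / 5) * 4   # 2.56

# Illustrative ratings for one element; the 0 (Undecided) is ignored
ratings = [3, 4, 0, 2, 3]
score = element_score(ratings)           # (3 + 4 + 2 + 3) / 4 = 3.0
label = interpret(score)                 # "Acceptable: Maintain"
```

The raised cut-off means a score of exactly 2.56 (the converted HSRC guideline) would still be classified as a development area under this study's rules.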
Data collection method and recording: Qualitative data
Semi-structured individual interviews were conducted with each member of the same sample that completed the OPQ. During the interviews, each participant was shown a diagram of the Burke–Litwin model and asked the following question, which was formulated by the researcher:
‘Looking at the Burke–Litwin model, how would you rate the performance of your municipality on each of the 12 elements?’ That is, did the respondent consider an element acceptable and to be maintained, or a development area?
If a respondent was unsure about what an element entailed, the interviewer would briefly explain it. The interviewers also probed the answers given by each respondent until clarity was reached on the rating of each element. Interviewers then asked each respondent for specific reasons why they considered an element to be a development area, where it was so identified. These ratings and reasons were noted in writing by each interviewer for each respondent and each element. Because of the large number of interviews that each of the four researchers had to conduct each day, it was not feasible in terms of time and resources to record and transcribe each interview. Furthermore, interviewers did not have to record everything said by an interviewee after probing questions were asked, but only his or her rating of an element as acceptable or a development area, and the reasons where a development area was identified.
Research procedure
After contracting with the management team of the district municipality, the actual data were gathered by inviting all permanent officials in the municipality to participate in the process. The municipality’s internal methods, namely emails and staff meetings, were used to share information on the planned data gathering with all applicable staff members. Communication of the data gathering process included a message from the municipal manager explaining the purpose of the survey, the issue of confidentiality and anonymity, what the results would be used for and the actions the municipality would be able to take based on the information gathered.
A specific day was identified and scheduled for the initial data gathering and communicated to all permanent officials in the municipality. A total of 116 permanent officials were willing to participate in the study and arrived on the specified day at a predetermined venue. After the purpose of the data gathering exercise was confirmed, officials were requested to complete and sign an informed consent form in which they undertook to voluntarily complete a survey instrument and participate in individual one-on-one interviews. Thereafter, all 116 officials completed the OPQ in one venue at the same time under supervision of the main researcher. This was followed by scheduling 1-hour semi-structured individual interviews with these same 116 officials over the next 5 days as follows:
- Day 1: Five interviews conducted by each of four interviewers = 20 interviews
- Day 2: Seven interviews conducted by each of four interviewers = 28 interviews
- Day 3: Seven interviews conducted by each of four interviewers = 28 interviews
- Day 4: Seven interviews conducted by each of four interviewers = 28 interviews
- Day 5: Three interviews conducted by each of four interviewers = 12 interviews
- Total: 116 interviews
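As a quick arithmetic check, the schedule above (interviews per interviewer per day, times four interviewers) sums to the full sample of 116; a trivial sketch:

```python
# Interviews conducted per interviewer per day; four interviewers throughout
per_interviewer = {"Day 1": 5, "Day 2": 7, "Day 3": 7, "Day 4": 7, "Day 5": 3}
interviewers = 4

# Daily totals: 20, 28, 28, 28 and 12 interviews
daily_totals = {day: n * interviewers for day, n in per_interviewer.items()}

total = sum(daily_totals.values())   # 20 + 28 + 28 + 28 + 12 = 116
```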
Statistical analysis of quantitative data
Quantitative data gathered by means of the OPQ were analysed by using SPSS version 23 (IBM, 2015). Cronbach’s alphas were calculated to measure the internal consistency of the OPQ while means, standard deviations and average scores for each of the Burke–Litwin elements were calculated to produce scores for the entire municipality. As the district municipality’s management team was only interested in overall scores for each Burke–Litwin element, differences in scores between the different biographical groupings were not calculated.
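Cronbach's alpha, used above to assess the internal consistency of the OPQ, can be computed from the item-level responses. The study itself used SPSS; the sketch below shows the standard formula with hand-rolled population variances and illustrative data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of questionnaire items.

    items: list of equal-length lists, one per item, each holding
    the respondents' scores for that item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        # Population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items answered identically by four respondents
# are perfectly consistent, giving alpha = 1.0
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]])
```

In practice a library routine (e.g. the SPSS RELIABILITY procedure used in the study) would be preferred, but the formula is the same.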
Qualitative data analysis
The qualitative data that were gathered by means of the semi-structured individual interviews were analysed in two ways. Firstly, responses were measured in terms of response rates (Vitale, Armenakis & Field, 2008). This was done by summating the ratings (acceptable or development area) given by each respondent to each element to obtain a total for each element. If 75% (87) or more of the respondents, the previously determined cut-off point, indicated that an element was acceptable, it was noted as acceptable and to be maintained. If fewer than 75% indicated that an element was acceptable, it was noted as a development area. Secondly, the specific reasons respondents gave for considering an element to be a development area were analysed by means of thematic analysis, defined by Braun and Clarke (2013, p. 178) as ‘a method for identifying themes (or patterns of meaning) across a data set in relation to a research question’. This analysis followed the seven stages set out by Braun and Clarke (2013), namely: (1) transcription, (2) reading familiarisation, (3) coding, (4) searching for themes, (5) reviewing themes, (6) defining and naming themes and (7) writing the report.
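The response-rate rule described above can be sketched as follows. The function name and rating labels are illustrative; the 75% cut-off (87 of 116 respondents) comes from the text.

```python
def classify_element(ratings, cutoff=0.75):
    """Classify one Burke-Litwin element from interview ratings.

    ratings: one entry per respondent, 'acceptable' or 'development'.
    If at least `cutoff` of respondents rated the element acceptable,
    it is noted as acceptable and to be maintained; otherwise it is
    noted as a development area.
    """
    acceptable = sum(1 for r in ratings if r == "acceptable")
    if acceptable / len(ratings) >= cutoff:
        return "Acceptable: Maintain"
    return "Development area"

# With n = 116 respondents, the cut-off point is 87 (75%)
n = 116
cutoff_count = int(n * 0.75)   # 87
```

With 87 'acceptable' ratings the element is just at the threshold and is classified as acceptable; with 86 it falls below 75% and becomes a development area.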
Strategies employed to ensure quality of data
Strategies employed to ensure quality of quantitative data
The first strategy employed to ensure the quality of the quantitative data was to use a 57% sample for the study, which is nearly double the 30% that Salkind (2012) considered to be a good representation of the population for a diagnostic survey. Secondly, the development of the survey instrument used in the study (the OPQ) followed a rigorous development protocol prescribed by Borg and Mastrangelo (2008), Collingridge (2014) and Wiley (2010). Thirdly, the face validity of the OPQ was established by means of a pilot study (Collingridge, 2014; Wiley, 2010). Fourthly, respondents completed the OPQ in one session under supervision so that any uncertainties, such as not understanding the meaning of a specific question, could be clarified immediately. Lastly, the content validity of the OPQ was established by means of triangulation, which indicated that the results of the questionnaire and the findings of the individual interviews showed convergent validity (Creswell, 2009; McFee, 1992).
Strategies employed to ensure quality of qualitative data
The quality of the qualitative data was ensured by focussing on trustworthiness, using the four criteria and the applicable provisions suggested for each criterion as developed by Guba (as cited in Elo et al., 2014; Shenton, 2004). Credibility was ensured by adopting well-established research methods (interviews; thematic data analysis), developing an early familiarity with the culture of the municipality (preliminary visits to the municipality; studying various appropriate documents prior to the data gathering), ensuring honesty of interviewees (giving them the opportunity to refuse to participate in the interviews) and utilising only highly qualified and experienced researchers (all researchers were registered industrial psychologists and experienced OD consultants). Transferability was ensured by providing the reader with sufficient contextual information about the study (type, size and location of the municipality; number of participants; data collection methods; number and duration of interviews; time period over which data were collected) to decide whether the findings could be transferred to similar situations. Dependability was ensured through thorough scientific planning and execution of the study and an in-depth description of the research design and procedure followed. Lastly, confirmability (objectivity) was ensured by using a team of four researchers to plan, collect and analyse the qualitative data in order to reduce the effect of single-researcher bias on the study.
Strategies employed to ensure quality of mixed-methods data
Mixed-methods validity, referred to as data legitimation by Leech, Onwuegbuzie and Combs (2011) and Onwuegbuzie and Johnson (2006), was ensured by utilising four of the nine legitimation types proposed by Onwuegbuzie and Johnson (2006). Sample integration legitimation was ensured by using a large and identical sample (n = 116, or 57%) for both the OPQ and the interviews. This enabled the combination of inferences that emerged from both approaches to construct what Tashakkori and Teddlie (2003, p. 687) refer to as ‘meta-inferences’. Weakness minimisation legitimation was ensured by compensating for the weakness of the quantitative approach (respondents not fully understanding the questions in the instrument) with the strength of the qualitative approach (interviewers could probe responses to ensure clarity and comprehension of the questions asked). Conversion legitimation was applied by counting the number of times interviewees rated an element in the Burke–Litwin model as either acceptable or a development area and comparing these ratings with the scores obtained for each element on the OPQ. This comparison confirmed convergent validity, as the ratings and scores were similar. Multiple validities legitimation was achieved because appropriate strategies to ensure quantitative, qualitative and mixed-methods data quality were applied and achieved, as described in the previous sections.
Ethical considerations
The research was conducted within an ethical framework as suggested by Creswell (2014). This included obtaining informed consent from all participants, protection of their identities and allowing them to withdraw from the study at any time. Approval to use the data for research purposes was obtained from the municipal manager of the municipality, while ethical clearance to conduct the study was obtained from the researcher’s university.
Results: Quantitative data
Reliability of the Organisational Performance Questionnaire
Table 2 indicates the reliability statistics obtained for the OPQ, which measured the 12 elements of the Burke–Litwin model. An overall reliability coefficient of 0.76 was obtained, with a low of 0.63 for the elements External Environment and Individual Needs and Values and a high of 0.91 for Management Practices. These reliability statistics indicate that the OPQ can be considered a reliable instrument, as Babbie (2010) states that coefficients of 0.3 and higher are considered acceptable for the behavioural sciences.
TABLE 2: Reliability statistics for the 12 Burke–Litwin elements measured by the Organisational Performance Questionnaire.
Average scores for the Burke–Litwin elements
Figure 4 indicates the average scores obtained on the 12 Burke–Litwin elements measured by the OPQ for the municipality, while Table 3 sets out the interpretation of the obtained scores as previously contracted with the municipality management team. The results indicate that four of the elements obtained a score of 3.0 and above and were considered as acceptable and should be maintained (External Environment, Mission and Strategy, Culture and Individual Needs and Values), while eight of the elements obtained a score of below 3.0 and were subsequently considered to be development areas.
FIGURE 4: Organisational Performance Questionnaire average scores for each Burke–Litwin element (n = 116).
TABLE 3: Interpretation of Organisational Performance Questionnaire scores per element (n = 116).
The results indicate that, in general, the transformational aspects of the Burke–Litwin model were considered acceptable, that is, those areas in which interaction with environmental forces (both within and without) requires entirely new sets of behaviour on the part of organisational members (Burke & Litwin, 1992). The results also indicate that most of the development areas were situated in the transactional aspects, which refer to interactions that take place primarily via relatively short-term reciprocity among people and groups and which affect organisational performance (Burke & Litwin, 1992).
Findings: Qualitative data
The findings of the individual interviews, shown in Table 4, indicated that participants considered four of the elements to be acceptable and to be maintained (External Environment, Mission and Strategy, Culture and Individual Needs and Values), while they considered eight of the elements to be development areas. Table 4 also indicates the final themes identified from the thematic analysis of the reasons respondents gave for considering an element a development area.
TABLE 4: Summary of the ratings by respondents for each Burke–Litwin element during the individual interviews and final themes identified from the interviews (n = 116).
Comparison (Triangulation) of quantitative and qualitative data
In order to determine whether there was convergent validity between the quantitative results and the qualitative findings (Creswell, 2009; McFee, 1992), the scores obtained from the OPQ were compared to the verbal ratings given by participants to each of the 12 Burke–Litwin elements. This comparison (triangulation) produced the results indicated in Table 5.
TABLE 5: Triangulation of quantitative and qualitative data.
The results of the comparison indicated that the same four elements that obtained a score of 3.0 or above on the OPQ were considered by participants in the interviews to be acceptable and to be maintained (External Environment, Mission and Strategy, Culture and Individual Needs and Values), while the same eight elements that obtained a score of below 3.0 on the OPQ were considered by participants in the interviews to be development areas.
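The triangulation step amounts to a per-element comparison of the two classifications. The sketch below is illustrative: the element names are from the Burke–Litwin model, but the numeric OPQ scores and the helper function are hypothetical, not the study's actual values.

```python
def triangulate(opq_scores, interview_labels):
    """Compare OPQ-based and interview-based classifications per element.

    opq_scores: dict mapping element -> average OPQ score (1.0-4.0)
    interview_labels: dict mapping element -> 'Acceptable: Maintain'
        or 'Development area', from the interview response rates
    Returns the list of elements on which the two methods converge.
    """
    converged = []
    for element, score in opq_scores.items():
        # Apply the same 3.0 (75%) cut-off used for the OPQ scores
        opq_label = "Acceptable: Maintain" if score >= 3.0 else "Development area"
        if opq_label == interview_labels[element]:
            converged.append(element)
    return converged

# Hypothetical scores for two elements, for illustration only
opq = {"External Environment": 3.2, "Management Practices": 2.4}
interviews = {"External Environment": "Acceptable: Maintain",
              "Management Practices": "Development area"}
agreement = triangulate(opq, interviews)   # both elements converge
```

In the study, all 12 elements converged in this way, which is what established convergent validity between the two data sources.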
Discussion
This study investigated the utility of mixed-methods research as a diagnostic approach for determining the organisational performance of a local government in South Africa. The theoretical perspective on which the transformative mixed-methods approach was based was the Burke–Litwin model of organisational performance. This study is considered important because organisational diagnoses are usually conducted by either quantitative or qualitative means alone; mixed-methods research remains a relatively underutilised approach, and its application here broadens the range of research approaches available for diagnosing the performance of organisations.
The mixed-methods research question posed was whether the quantitative results and qualitative findings of an organisational diagnosis based on the Burke–Litwin model of organisational performance show convergent validity as to how a local government is performing. Results indicate that the elements of the Burke–Litwin model identified by the OPQ as acceptable or as development areas were identical to the elements identified by participants during the individual interviews, thus establishing convergent validity. Mixed-methods research was thus shown to be a valid technique for establishing the integrity of survey data, as proposed by Onwuegbuzie and Johnson (2006) and supported by research conducted by Collins et al. (2006) and Vitale et al. (2008).
The results also indicate that mixed-methods research provided a pragmatic advantage for gathering diagnostic data in a local government, as the qualitative responses augmented and explained survey responses, an advantage identified by Driscoll, Appiah-Yeboah, Salib and Rupert (2007). In addition, mixed-methods research draws on both qualitative and quantitative data, which provided a better and more holistic understanding of the functioning of the organisation, a major advantage of mixed-methods research advocated by Creswell (2014) and what Collins et al. (2006, p. 116) refer to as ‘significance enhancement’. This supports research in which mixed-methods research was used successfully for various research objectives, including the diagnosis of a pharmacy department (Paul, 1996), assessing organisational culture (Yauch & Steudel, 2003), studying relocation decision making and couple relationships (Challiol & Mignonac, 2005), studying adolescent alcohol use (Newman et al., 2006), studying social support in widowhood (Scott et al., 2007), evaluating a school principal development programme (Youngs & Piggot-Irvin, 2012) and studying the antecedents of servant leadership (Beck, 2014).
The results also indicate that the Burke–Litwin model (Burke & Litwin, 1992) is a useful and valid diagnostic framework for identifying the strengths and development areas of an organisation’s performance, as has been confirmed by other studies (Martins & Coetzee, 2009; Olivier, 2014).
The results also established the OPQ as a reliable and valid survey instrument for gathering data based on the Burke–Litwin model of organisational performance. Reliability was confirmed by an overall Cronbach’s alpha coefficient of 0.76, while content validity was confirmed by the triangulation of quantitative and qualitative data, which indicated that the results of the OPQ were similar to the findings of the individual interviews. Triangulation was thus shown to be a valid technique for establishing the integrity of survey data, referred to as establishing ‘instrument fidelity’ by Collins et al. (2006, p. 116).
Limitations and recommendations for future studies
The first limitation of this study was that it utilised a concurrent transformative mixed-methods strategy using triangulation to determine convergent validity, which according to Creswell (2009, p. 214) ‘requires great effort and expertise to adequately study a phenomenon with two separate methods’. It also proved challenging to compare the results of a survey (quantitative data) with the findings of individual interviews (qualitative data). This limitation was overcome by a thorough study of mixed-methods research strategies and previous mixed-methods studies and by using a four-member research team, whose continuous cross-checking ensured unbiased interpretations and comparisons of survey and interview data. This limitation was further overcome by using common terminology when integrating the OPQ and interview ratings, namely Acceptable: Maintain and Development area. A second limitation of the study was that it was conducted in only one local government in South Africa, so results may not be generalisable to other local governments or organisations in the private sector. A third limitation was that although the OPQ as a diagnostic instrument was shown to possess face and content validity, its construct validity has not yet been statistically established. A fourth limitation was that the individual interviews conducted as part of the qualitative data gathering were not audio recorded and transcribed, because of the large number of interviews that each of the four researchers had to conduct each day, which could have resulted in information being lost. Lastly, the diagnosis was conducted over a period of 3 months, so the municipality may already have begun rectifying certain identified challenges by the time the feedback report based on the diagnosis was compiled and presented to management.
It is recommended that a mixed-methods approach to organisational diagnosis be followed in other municipalities as well as in private sector organisations to determine the utility of this approach in diverse settings. A further recommendation is that the construct validity of the OPQ be statistically established for a variety of profit and non-profit organisations.
Practical implications for management
A mixed-methods research approach is a useful method for diagnosing the performance of organisations, as it ensures data integrity and provides a more comprehensive picture of an organisation’s performance (Collins et al., 2006; Creswell, 2014). However, the process of combining survey data with qualitative data can be time consuming, which may discourage managers from using this approach in favour of a quick-diagnosis survey approach. This time-consuming aspect could even lead researchers to reduce sample sizes or limit interview time, as pointed out by Driscoll et al. (2007). A further practical implication is that managers and practitioners can use the Burke–Litwin model with confidence as a basis for diagnosing the performance of an organisation, as it identifies the most important aspects of an organisation’s functioning.
Conclusion
The results of this study support the views of numerous researchers that mixed-methods research provides a more holistic and comprehensive understanding of the phenomena being studied (Beck, 2014; Challiol & Mignonac, 2005; Collins et al., 2006; Creswell, 2014; Newman et al., 2006; Paul, 1996; Scott et al., 2007; Yauch & Steudel, 2003; Youngs & Piggot-Irvin, 2012). When such an approach is based on a valid model of organisational performance such as the Burke and Litwin model (Burke & Litwin, 1992), an accurate and valid picture of an organisation’s current functioning can be obtained as a first step to improving organisational performance (Martins & Coetzee, 2009; Olivier, 2014).
Acknowledgements
Competing interests
The author declares that he has no financial or personal relationships which may have inappropriately influenced him in writing this article.
References
Andrews, M., & Shah, A. (2003). Assessing local government performance in developing countries. In A. Shah, (Ed.), Handbook of public sector performance reviews, Vol. 2 (pp. 3.2–3.26). Washington, DC: World Bank.
Atkinson, D. (2007). Taking to the streets: Has developmental local government failed in South Africa? In S. Buhlungu, J. Daniel, R. Southall, & J. Lutchman (Eds.), State of the nation: South Africa 2007 (pp. 251–286). Cape Town: Human Sciences Research Council.
Babbie, E. (2010). The practice of social research. (12th edn.). Belmont, CA: Cengage Learning.
Beck, C.D. (2014). Antecedents of servant leadership: A mixed methods study. Journal of Leadership & Organizational Studies, 21(3), 299–314. https://doi.org/10.1177/1548051814529993
Borg, I., & Mastrangelo, P.M. (2008). Employee surveys in management: Theories, tools and practical applications. Cambridge, MA: Hogrefe & Huber.
Boyne, G.A., James, O., John, P., & Petrovsky, N. (2009). Democracy and government performance: Holding incumbents accountable in English local governments. The Journal of Politics, 71(4), 1273–1284. https://doi.org/10.1017/S0022381609990089
Braun, V., & Clarke, V. (2013). Successful qualitative research: A practical guide for beginners. Thousand Oaks, CA: Sage.
Brown, D.R. (2011). An experiential approach to organizational development. (8th edn.). Upper Saddle River, NJ: Prentice Hall.
Bryman, A. (2007). Barriers to integrating quantitative and qualitative research. Journal of Mixed Methods Research, 1(1), 8–22. https://doi.org/10.1177/2345678906290531
Burke, W.W., & Litwin, G.H. (1992). A causal model of organisational performance and change. Journal of Management, 18(3), 523–545. https://doi.org/10.1177/014920639201800306
Cameron, R. (2005). Metropolitan restructuring (and more restructuring) in South Africa. Public Administration and Development, 25(4), 329–339. https://doi.org/10.1002/pad.383
Castro, M.L., & Martins, N. (2010). The relationship between organisational climate and employee satisfaction in a South African information and technology organisation. South African Journal of Industrial Psychology, 36(1), 1–9.
Challiol, H., & Mignonac, K. (2005). Relocation decision-making and couple relationships: A quantitative and qualitative study of dual-earner couples. Journal of Organizational Behavior, 26(3), 247–274. https://doi.org/10.1002/job.311
Collingridge, D. (2014). Validating a questionnaire. Retrieved February 25, 2017, from http://www.methodspace.com/validating-a-questionnaire/
Collins, K.M.T., Onwuegbuzie, A.J., & Sutton, I.L. (2006). A model incorporating the rationale and purpose for conducting mixed methods research in special education and beyond. Learning Disabilities: A Contemporary Journal, 4, 67–100.
Constitution of the Republic of South Africa. (1996). Government Gazette. (No. 17678). Pretoria: Government Printer.
Craythorne, D.L. (2006). Municipal administration: The handbook. (6th edn.). Cape Town: Juta.
Creswell, J.W. (2009). Research design: Qualitative, quantitative and mixed methods approaches. (3rd edn.). Thousand Oaks, CA: Sage.
Creswell, J.W. (2014). Research design: Qualitative, quantitative and mixed methods approaches. (4th edn.). Thousand Oaks, CA: Sage.
Cummings, T.G., & Worley, C.G. (2015). Organisation development and change. (10th edn.). Stamford, CT: Cengage Learning.
Department of Cooperative Governance and Traditional Affairs (DCOGTA). (2009). State of local government in South Africa report. Pretoria: Government Press.
Department of Cooperative Governance and Traditional Affairs (DCOGTA). (2016). Annual report for the financial year 2014/2015. Retrieved April 16, 2017, from www.cogta.gov/cgta_2016/wp-content/uploads/2016/06/COGTA-Annual-Report-2014-2015.pdf
Department of Planning, Monitoring and Evaluation. (2014). Twenty year review: South Africa 1994–2014: Background paper: Local government. Retrieved April 16, 2017, from www.dpme.gov.za/publications/…/20YR%20Local%20Government.pdf
Driscoll, D.L., Appiah-Yeboah, A., Salib, P., & Rupert, D.J. (2007). Merging qualitative and quantitative data in mixed methods research: How to and why not. Retrieved January 15, 2017, from http://digitalcommons.unl.edu/icwdmeea/18
Elo, S., Kääriäinen, M., Kanste, O., Pölkki, T., Utriainen, K., & Kyngäs, H. (2014). Qualitative content analysis: A focus on trustworthiness. Sage Open, 4(1), 1–10. https://doi.org/10.1177/2158244014522633
Falletta, S.V. (2005). Organizational diagnostic models: A review and synthesis. Sunnyvale, CA: Leadersphere.
French, W.L., Bell, C.H., & Zawacki, R.A. (Eds.). (1978). Organisation development: Theory, practice and research. Dallas, TX: Business Publications.
Ghorpade, J. (1971). Assessment of organizational effectiveness: Issues, analysis, readings. Pacific Palisades, CA: Goodyear.
Hall, R.H. (1999). Organizations: Structures, processes and outcomes. (7th edn.). Upper Saddle River, NJ: Prentice Hall.
Harrison, M.I. (2005). Diagnosing organizations: Methods, models, and processes. (3rd edn.). Thousand Oaks, CA: Sage.
Howard, A. (1994). Diagnosis for organizational change: Methods and models. New York: The Guilford Press.
IBM (2015). IBM SPSS Statistics for Windows, version 23. Armonk, NY: IBM Corp.
IBM Business Consulting Services. (2003). 2002 Langley Research Center (LaRC) Organizational Performance Survey. Fairfax, VA: IBM Publishers.
Immordino, K.M. (2010). Organizational assessment and improvement in the public sector. Boca Raton, FL: CRC Press.
Jackson, W.U. (1984). Assessing performance in local government. Long Range Planning, 17(3), 24–31. https://doi.org/10.1016/0024-6301(84)90005-0
Johnson, R.B., Onwuegbuzie, A.J., & Turner, L.A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112–133. https://doi.org/10.1177/1558689806298224
Jones, B.B., & Brazzel, M. (2006). The NTL handbook of organizational development and change: Principles, practices and perspectives. San Francisco, CA: Pfeiffer.
Koma, S.B. (2010). The state of local government in South Africa: Issues, trends and options. Journal of Public Administration, 45(1.1), 111–120.
Lee, D., & Brower, R.S. (2006). Pushing the envelope on organizational effectiveness: Combining an old framework and a sharp tool. Public Performance and Management Review, 30(2), 155–178. https://doi.org/10.2753/PMR1530-9576300202
Leech, N.L., & Onwuegbuzie, A.J. (2009). A typology of mixed methods research designs. Quality and Quantity, 43(2), 265–275. https://doi.org/10.1007/s11135-007-9105-3
Leech, N.L., Onwuegbuzie, A.J., & Combs, J.C. (2011). Writing publishable mixed methods research articles: Guidelines for emerging scholars in the health sciences and beyond. International Journal of Multiple Research Approaches, 5(1), 7–24. https://doi.org/10.5172/mra.2011.5.1.7
Leibbrandt, J.H., & Botha, C.J. (2014). Leadership and management as an enabler for strategy execution in municipalities in South Africa. Mediterranean Journal of Social Sciences, 5(20), 329–339. https://doi.org/10.5901/mjss.2014.v5n20p329
Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 140, 1–55.
Local Government: Municipal Structures Act. (1998). Government Gazette. (No. 19614). Pretoria: Government Printer.
Locke, K. (2001). Grounded theory in management research. London: Sage.
Lusthaus, C., Adrien, M., Anderson, G., Carden, F., & Montalván, G.P. (2002). Organizational assessment: A framework for improving performance. Washington, DC: Inter-American Development Bank.
Martins, N., & Coetzee, M. (2009). Applying the Burke-Litwin model as a diagnostic framework for assessing organisational effectiveness. South African Journal of Human Resource Management, 7(1), 144–156.
Martins, N., & Geldenhuys, D. (2016). Fundamentals of organisation development. Cape Town: Juta.
Mbecke, P. (2014). Corporate municipal governance for effective and efficient public service delivery in South Africa. Journal of Governance and Regulation, 3(4), 98–106. https://doi.org/10.22495/jgr_v3_i4_c1_p2
McFee, G. (1992). Triangulation in research: Two confusions. Educational Research, 34(3), 215–219. https://doi.org/10.1080/0013188920340305
Mitlin, D. (2000). Towards more pro-poor local governments in urban areas. Environment and Urbanization, 12(1), 3–11. https://doi.org/10.1177/095624780001200101
Mortimer, N.L. (2004). The establishment and development of a performance management system in the local government sector in South Africa. Unpublished doctoral thesis, University of KwaZulu-Natal, Durban, South Africa.
Municipal IQ Hotspots Monitor. (2016). Municipal data and Intelligence. Retrieved March 08, 2017, from www.municipaliq.co.za/index.php?site_page=hotspots.php
Mutiro, N., & Fore, S. (2015). The perception of corporate services directorate in a metropolitan municipality on King III good governance compliance in business and projects. Journal of Governance and Regulation, 4(1), 130–140. https://doi.org/10.22495/jgr_v4_i1_c1_p4
Newman, I., Shell, D.F., Ming, Q., Jianping, X., & Maas, M.R. (2006). Adolescent alcohol use: Mixed methods research approach. Educational Psychology Papers and Publications. Paper 92. Retrieved March 17, 2017, from http://digitalcommons.unl.edu/edpsychpapers/92
Olivier, B.H. (2014). The development and validation of an assessment framework for measuring the organisational effectiveness of a metropolitan municipality in South Africa. Unpublished doctoral thesis, University of South Africa, Pretoria, South Africa.
Onwuegbuzie, A.J., & Collins, K.M.T. (2007). A typology of mixed methods sampling designs in social science research. The Qualitative Report, 12(2), 281–316.
Onwuegbuzie, A.J., & Johnson, R.B. (2006). The validity issue in mixed research. Research in the Schools, 13(1), 48–63.
Paul, J. (1996). Between-method triangulation in organizational diagnosis. International Journal of Organizational Analysis, 4(2), 135–156. https://doi.org/10.1108/eb028845
Rakodi, C. (1997). Global forces, urban challenges, and urban management in Africa. In C. Rakodi (Ed.), The urban challenge in Africa: Growth and management of its largest cities (pp. 17–73). New York: United Nations University Press.
Rossman, G.B., & Wilson, B.L. (1985). Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Evaluation Review, 9(5), 627–643. https://doi.org/10.1177/0193841X8500900505
Saeed, B.B., & Wang, W. (2013). Organisational diagnoses: A survey of the literature and proposition of a new diagnostic model. International Journal of Information Systems and Change Management, 6(3), 222–238. https://doi.org/10.1504/IJISCM.2013.058328
Salkind, N.J. (2012). Exploring research. (8th edn.). Upper Saddle River, NJ: Prentice Hall.
Schmidt, D. (2010). Leadership in local governance and development. In Ethical leadership and political culture in local government: A civil society perspective on local government in South Africa. Retrieved April 16, 2017, from http://ggln.org.za/ggln-sorg-report-july2010.pdf
Scott, S.B., Bergeman, C.S., Verney, A., Longenbaker, S., Markey, M.A., & Bisconti, T.L. (2007). Social support in widowhood: A mixed methods study. Journal of Mixed Methods Research, 1(3), 242–266. https://doi.org/10.1177/1558689807302453
Shenton, A.K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63–75. https://doi.org/10.3233/EFI-2004-22201
Smit, B.R. (1999). Using the South African excellence model to focus improvement strategies in South Africa. South African Journal of Industrial Engineering, 10(1), 63–72.
South African Institute of Race Relations. (2014). The 80/20 report: Local government in 80 indicators after 20 years of democracy. Retrieved April 16, 2017, from http://www.poa.gov.za/localgovernment/Supporting%20Documentation/The%2080-20%20Report%20on%20Local%20Government-26%20May%202014.pdf
Sowa, J.E., Selden, S.C., & Sandfort, J.R. (2004). No longer unmeasurable? A multidimensional integrated model of nonprofit organizational effectiveness. Nonprofit and Voluntary Sector Quarterly, 33(4), 711–728. https://doi.org/10.1177/0899764004269146
Tashakkori, A., & Teddlie, C. (2003). The past and future of mixed methods research: From data triangulation to mixed model designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 671–701). Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (Eds.). (2010). Handbook of mixed methods in social and behavioral research. (2nd edn.). Thousand Oaks, CA: Sage.
Thomas, K.M. (2012). An organizational diagnosis of a centralized investigational new drug core within a large academic health center. Unpublished master’s dissertation, University of Pennsylvania, Philadelphia, PA.
Torrance, H. (2012). Triangulation, respondent validation, and democratic participation in mixed methods research. Journal of Mixed Methods Research, 6(2), 111–123. https://doi.org/10.1177/1558689812437185
Tsuchiya, K. (2016). Intervention in time of organization: Case study of organizational diagnosis during organization development. The Japanese Journal of Experimental Social Psychology, 56(1), 70–81. https://doi.org/10.2130/jjesp.si2-5
Van Thiel, S., & Leeuw, F.L. (2002). The performance paradox in the public sector. Public Performance and Management Review, 25(3), 267–281. https://doi.org/10.2307/3381236
Vitale, D.C., Armenakis, A.A., & Field, H.S. (2008). Integrating qualitative and quantitative methods for organizational diagnosis: Possible priming effects? Journal of Mixed Methods Research, 2(1), 87–105. https://doi.org/10.1177/1558689807309968
Wiley, J.W. (2010). Strategic employee surveys. San Francisco, CA: Jossey-Bass.
Yauch, C.A., & Steudel, H.J. (2003). Complementary use of qualitative and quantitative cultural assessment methods. Organizational Research Methods, 6(4), 465–481. https://doi.org/10.1177/1094428103257362
Youngs, H., & Piggot-Irvine, E. (2012). The application of a multiphase triangulation approach to mixed methods: The research of an aspiring school principal development program. Journal of Mixed Methods Research, 6(3), 184–198. https://doi.org/10.1177/1558689811420696