Abstract
The South African education system bears evidence of fluctuations in the final Grade 12 mathematics marks across different learner profiles. This study reflected on the National Senior Certificate (NSC) mathematics results from the Western Cape Education Department for the years 2009 to 2014, the period just after the introduction of the NSC in 2008 and including the updated NSC introduced in 2014. Accordingly, this study aimed to examine learners’ performance by socio-economic school quintile and education district for the period 2009 to 2014, for learners in the Western Cape. Instead of the ordinary regression model, we adopted the quantile regression approach to examine the effect of school (national) quintile (NQ) type and education district at different quantiles of learner performance in the mathematics examination. The results showed that there is a significant school quintile type and education district effect on learner performance in NSC mathematics examinations for learners in the Western Cape. In some years, there were no significant performance differences between learners from NQ2 and NQ4 schools in the different quantiles. Similarly, learner performance differences between NQ3 and NQ4 schools were not significant. As we moved from 2009 to 2014, the performance difference between the lower school quintiles and the upper school quintiles narrowed, although the performance differences remained significant. These differences were smallest in 2013. This is a good sign, as it indicates that government efforts and policies, designed to narrow the historical social disparities manifested in the schools, have been somewhat successful. Identifying and scrutinising the school quintile types and education districts where the gap remains wide will assist the government in reviewing policies and interventions to accelerate transformation.
Keywords: Quantile regression; school quintile; education districts; performance; Western Cape; education system.
Introduction
Although the apartheid era ended in 1994, South Africa still remains one of the most unequal countries globally (Adjaye-Gbewonyo et al., 2018; NPC, 2011). This inequality has extended into its education system and has largely contributed to the unequal educational opportunities for learners from different backgrounds (Graven, 2014; Ogbonnaya & Awuah, 2019). To redress the legacy of inequality in the education context, the schooling system is divided into government (public) schools and independent (private) schools. According to the South African Schools Act (No. 84 of 1996), public schools should be funded through public funds, while independent schools may apply for subsidies from their relevant provinces (Dass & Rinquest, 2017; Franklin, 2017). For the purpose of financial allocations, government schools have been grouped, according to the socio-economic status of the community within which the school is located, into five national quintiles (NQ), from NQ1 to NQ5. Each of these quintiles caters for 20% of the learners nationally, based on a ranking of socio-economic status, which is measured by the income, unemployment rates and illiteracy within the school’s catchment area. Schools in the poorest communities are classified as NQ1, and those in the wealthiest communities as NQ5 (DBE, 2006; Moses, Van der Berg, & Rich, 2017; Western Cape Education Department [WCED], 2018). Schools classified as NQ1 to NQ3 are non-fee-paying schools; these schools receive relatively more funding per learner compared to NQ4 and NQ5 schools (Dass & Rinquest, 2017; Graven, 2014). However, the accuracy of the quintile classification has been questioned, as it is based on the socio-economic status of the school’s surrounding area rather than the status of the households of the learners who attend the school (Ally & McLaren, 2016; Dass & Rinquest, 2017). Further concerns about the classification of the quintiles have been raised and discussed in the work of Hall and Giese (2009) and Mestry and Ndhlovu (2014), among others. In addition, the provinces in South Africa are divided into geographic areas called education districts, as defined by the provincial Member of the Executive Council for education. The education districts provide the provincial education departments with direct, vital lines of communication with schools, for the purpose of effective education management (DBE, 2018; Motala, Dieltiens, & Sayed, 2009; SAnews, 2018). Generally, the education districts encompass different school quintile types and cover wide geographic areas. Notably, in rural education districts, owing to the homogeneity of the community and the limited number of schools, education district and school quintile type may appear to coincide. However, urban education districts contain many schools, with different quintile types represented within the same district.
Despite the measures taken by the South African government to address the imbalances of the past through the funding allocation system, the performance gap for learners in the different school quintile types and education districts remains a challenge. A study by Zondo, Zewotir and North (2021), which investigated the difficulty level and discriminatory power of the 2009 National Senior Certificate (NSC) mathematics examination for learners in the Western Cape, reported that much higher abilities are needed for learners in lower quintile schools to perform well in the mathematics examination.
This study examines the mark distribution of the NSC mathematics examinations for the years 2009 to 2013, by socio-economic school quintile type and education district. Furthermore, the effects of school quintile type and education district on learner performance in mathematics under the updated NSC, introduced in 2014, are investigated. The findings from 2009 to 2013 can be contrasted with those for 2014, in order to detect changes in performance. This study can potentially serve as a baseline for future analysis of learner performance in the revised Mathematics curriculum.
Literature review
Mathematics is an important component of general education at school level (Mullis, Martin, Foy, & Arora, 2012; Reddy, 2006). Good performance in mathematics is, consequently, an important component in planning to progress from school to higher education (Banerjee, 2016; Baya’a, 1990; Devine, Fawcett, Szűcs, & Dowker, 2012; DBE, 2016). In South Africa, an observed feature of the education system is poor performance in mathematics at school level (Graven, 2014; Reddy, 2006). Studies on learner performance in mathematics have reported a range of factors associated with poor performance at school level. For example, Makgato (2007) reported that access to resources contributed to poor performance in mathematics for learners from a selected sample of schools in Pretoria, South Africa. Eide and Showalter (1998) and, later, Burnett and Farkas (2009), who analysed data collected on learners from the United States, reported that some school variables and socio-economic status had a significant negative effect on learner performance in mathematics. Consequently, these factors remain of global concern.
South Africa’s NSC, which came into effect in 2008, as a single national qualification issued across all provinces upon successfully completing Grade 12, serves as a key qualification to national higher education institutions and the working environment (Mahlobo, 2015; Sasman, 2011). In the ongoing effort to investigate learner performance in school mathematics, various researchers have identified factors associated with learner performance (Graven, 2014; Howie, 2003; Maree et al., 2006; Mji & Makgato, 2006; Ogbonnaya & Awuah, 2019; Ramohapi, Maimane, & Rankhumise, 2015; Spaull & Kotze, 2015). South Africa’s 2008–2013 indicators report on the NSC revealed that schools in the lower quintile groups accounted for the highest proportion of learners with poor performance (Umalusi, 2015). For performance in mathematics in particular, the report concluded that schools in the middle quintiles were improving, while schools in the NQ1 category continued to find the subject a steep challenge. In a study by Reddy et al. (2012), using the Trends in International Mathematics and Science Study (TIMSS) data, it was shown that in South Africa, NQ1 and NQ2 schools perform at similar levels in mathematics, but the results are generally lower than NQ3, NQ4 and NQ5 schools. Other studies further reported that learners from communities with low socio-economic status developed skills more slowly and received fewer educational returns compared to their counterparts with higher socio-economic status (Morgan, Farkas, Hillemeier, & Maczuga, 2009; Roscigno & Ainsworth-Darnell, 1999). Moreover, schools in these low socio-economic communities are often under-resourced (Mestry, 2014), which has a further negative effect on learner performance (Aikens & Barbarin, 2008). In a study by Keble (2012), using data collected in selected schools in the Port Elizabeth area, socio-economic status and school access to resources were among the factors found to correlate with poor performance in school mathematics.
In a study by Longueira (2016), who looked at the effect of the school quintile type funding system on learner performance, a significant difference was reported in the marks of learners from NQ3 and NQ1 schools, with learners from NQ3 schools performing better than their counterparts from NQ1 schools. Similar findings were reported for disparities in learner performance between NQ2 and NQ1 schools, with learners from NQ1 performing the least well. In a study by Legotlo, Maaga, and Sebego (2002) on a sample from a rural province in South Africa, it was reported that the education district in which the school is located had a significant effect on learner performance. The participants further reported ineffective policies and lack of communication between schools and education district offices and also between districts and provincial offices of basic education. Boateng (2014), who examined the technical efficiencies in delivering basic education for 13 education district offices in South Africa, concluded that the inefficiencies potentially affect education outcomes and should be given due consideration in the reform of education policies. Seemingly, school quintile type and education district have long been reliable predictors of learner performance in schools.
Theoretical framework
The theoretical basis for the analysis of the data in this study is informed by a conceptual model developed by Howie (2002; adapted from Shavelson, McDonnell, & Oakes, 1987), as well as other research discussed in the literature review (Boateng, 2014; Graven, 2014; Keble, 2012; Reddy et al., 2012). The model presents factors related to learner performance in school mathematics within the South African context in terms of inputs, processes and outputs in the education system. The inputs include factors related to geospatial contexts at provincial and district levels. The processes in the education system consider the socio-economic situation within the geospatial contexts. Learner performance is then used to measure outputs. Accordingly, for this study, education district, school quintile type and learner performance serve as the inputs, processes and outputs, respectively.
The data used in this study are those of the NSC mathematics examination, made available for each year, for learners in the Western Cape who wrote the examination, from 2009 (shortly after the commencement of the NSC) to 2013, and also for 2014, the year in which the updated NSC commenced (Grussendorff, Booyse, & Burroughs, 2014; Ramatlapana & Makonye, 2012; Umalusi, 2015).
Objectives of the study
This study reports on differences in the raw, unadjusted learner marks in the NSC mathematics examination for the period 2009 to 2014, focusing primarily on the effects of school quintile type and education district on these differences for learners in the Western Cape. The following questions are addressed in the study:
- What are the effects of socio-economic school quintile type on learner performance in the NSC mathematics examination for the period 2009 to 2014?
- What are the effects of education district on learner performance in the NSC mathematics examination for the period 2009 to 2014?
- What are the differences in learner performance in the NSC mathematics examination for learners from the different socio-economic school quintile types and education districts, across the years 2009 to 2014?
Research methodology
A quantitative research approach was used in this study to investigate performance variations among learners who wrote the NSC mathematics examination for the period 2009 to 2014, in the Western Cape. A regression analysis approach was used to analyse the effects of school quintile type and education district on learner performance in the examinations.
Regression analysis is classically used to find a relationship between a dependent variable (Y) and p predictor variables X1, X2, …, Xp. The ordinary regression model takes the form:

Y = β0 + β1X1 + β2X2 + … + βpXp + ε,

where β0 is the intercept and ε is the error term of the model, assumed to be normally distributed with mean zero and variance σ2. The regression coefficients β1, β2, …, βp measure the change in the average value of Y for a unit change in X1, X2, …, Xp, respectively, while keeping the other variables fixed. The estimation of the regression coefficients is often performed using the ordinary least squares method, which minimises the error sum of squares (Hao & Naiman, 2007a; Zhang, 2009). However, the ordinary regression model has a number of limitations, in particular when the data are skewed (not normally distributed), multimodal or contain a high number of outliers. Moreover, as noted by Hao and Naiman (2007b), the ordinary regression model cannot answer research questions at lower or upper values of the dependent variable (Y). In other words, the ordinary regression model deals with the mean value of the dependent variable at different combinations of the predictor variables. Quantile regression alleviates such limitations of the ordinary regression model by estimating the effect of the predictor variables (X1, X2, …, Xp) at different quantile levels of the dependent variable (Y).
The quantile level is the probability or the proportion of the population that is associated with a quantile, and the corresponding conditional quantile of the dependent variable Y is denoted by Qτ(Y|X). That is, for a quantile level τ ∈ [0,1], the conditional quantile Qτ(Y|X) is the value of the dependent variable below which the proportion of the conditional response population is τ (Hao & Naiman, 2007a; Rodriguez & Yao, 2017). The regression model of the dependent variable (Y) on a set of independent variables X1, X2, …, Xp for quantile level τ is given by:

Qτ(Y|X) = β0(τ) + β1(τ)X1 + β2(τ)X2 + … + βp(τ)Xp.

The regression coefficients β0(τ), β1(τ), …, βp(τ) are estimated by minimising the following objective function over the observations i = 1, …, n (Baum, 2013):

∑i ρτ(Yi − β0(τ) − β1(τ)Xi1 − β2(τ)Xi2 − … − βp(τ)Xip),

where ρτ(u) = u(τ − I(u < 0)) is the check (quantile loss) function and I(·) denotes the indicator function.
From the quantile regression approach, it may be found that the relationship between the dependent and predictor variables differs at each quantile level τ; that is, the objective function yields a different set of regression coefficient estimates for each quantile level τ (Favero & Belfiore, 2019; Koenker & Bassett, 1978). The term quantile generalises terms such as quartile, decile and percentile (Hao & Naiman, 2007b). Quantile regression models the conditional quantiles of the dependent variable and aims at estimating either the conditional median, which is a special case of quantile regression, or other quantiles of the dependent variable, which describe non-central positions of the distribution. The ordinary regression model yields estimates that approximate the conditional mean of the dependent variable, given certain values of the predictor variables, while the quantile regression model specifies the conditional quantile function. One advantage of quantile regression, relative to ordinary regression, is that the quantile regression estimates are more robust against outliers (Koenker & Bassett, 1978). This means that quantile regression can provide reliable regression estimates in the presence of extreme values in the data set.
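To make the contrast between the two approaches concrete, the following is a minimal illustrative sketch in Python (using the statsmodels library) on simulated, skewed data; the study's own analysis was conducted in SAS, and the variable names and data below are hypothetical rather than drawn from the NSC data set.

```python
# Minimal sketch: ordinary least squares (conditional mean) versus quantile
# regression (conditional quantiles) on simulated, right-skewed data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
x = rng.uniform(0, 10, n)
# Skewed, heteroscedastic errors, so the mean alone summarises the data poorly.
y = 20 + 3 * x + rng.gamma(shape=2.0, scale=1.0 + 0.5 * x)
df = pd.DataFrame({"y": y, "x": x})

ols_fit = smf.ols("y ~ x", data=df).fit()                # conditional mean
median_fit = smf.quantreg("y ~ x", data=df).fit(q=0.5)   # conditional median
upper_fit = smf.quantreg("y ~ x", data=df).fit(q=0.8)    # 0.8 quantile

# The estimated slope on x differs by quantile level when the effect of x
# is not uniform across the distribution of y.
print(ols_fit.params["x"], median_fit.params["x"], upper_fit.params["x"])
```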
The data
The data used in this study include the individual raw, unadjusted learner examination marks and information on the learners’ school characteristics, namely the socio-economic school quintile type (NQ1, NQ2, NQ3, NQ4, NQ5 and independent schools) and the education district. The Western Cape education districts are the West Coast, Cape Winelands, Eden and Central Karoo, Overberg, Metro North, Metro South, Metro East and Metro Central (WCED, 2018). The data do not include records relating to the Overberg district, as they were not available in the provided data set at the time of analysis. In the years 2009 to 2013, the core content of the Mathematics curriculum was examined by means of two compulsory examination papers (Paper 1 and Paper 2), with optional material being examined in Paper 3. The data do not include records relating to learner marks in the optional Paper 3 examination, as these were not available at the time of analysis; moreover, Paper 3 was not a compulsory paper and consequently was not written by all learners. The dependent variable is the unadjusted mathematics examination mark for each learner, calculated as the average of the marks for Paper 1 and Paper 2, which were written by all learners in the study, and expressed as a percentage. The predictor variables in the study were school quintile type and education district. The mark is unadjusted in the sense that it is the raw mark that the individual learner scored in the examination, and it does not include marks for the continuous assessment component.
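Purely for illustration, the dependent variable described above could be constructed as follows; the file name and column names (paper1, paper2, quintile, district) are hypothetical stand-ins, not the actual WCED field names.

```python
# Sketch of constructing the dependent variable: the average of the two
# compulsory papers, expressed as a percentage. All names are hypothetical.
import pandas as pd

learners = pd.read_csv("wced_nsc_mathematics_2009.csv")

# Average raw mark over Paper 1 and Paper 2 (assumed here to already be
# expressed as percentages).
learners["mark"] = (learners["paper1"] + learners["paper2"]) / 2

# The two categorical predictors used in the regression models.
learners["quintile"] = learners["quintile"].astype("category")
learners["district"] = learners["district"].astype("category")

print(learners[["mark", "quintile", "district"]].head())
```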
Final NSC marks are (potentially) standardised by Umalusi, the national standardisation body, to adjust for variations resulting from standards in the marking process and other factors in the examination process from one year to another and from one examination body to the next (Umalusi, 2011). In a press release, Umalusi (2011) mentioned that the standardisation of the learner marks, by subject, uses a sophisticated statistical model that increases or decreases the learner marks by a proportion of their total mark in the subject. It was further stated that a high percentage of the NSC subjects’ raw examination marks are accepted by Umalusi and are not adjusted. For example, in 2010, raw unadjusted learner examination marks were accepted for 67.24% of the NSC subjects (Umalusi, 2011).
Ethical consideration
The data for this study were collected for a special project run by a senior individual in the WCED in South Africa. Special permission was obtained for the use of the anonymised data in this research study, and the data were personally sent to the authors by the senior individual in the WCED. Ethical clearance was not required for this study, as the data were properly anonymised by the WCED in such a way that no identity number or examination number could be linked to an individual learner or school name.
Data exploration
Learner marks used in this study were the raw percentages a learner in the Western Cape obtained in the NSC mathematics examination. Overall, there was an increase in the mean marks from 2009 to 2013, with a slight decrease in 2014, the year in which the updated NSC was introduced. That is, the average mathematics examination marks for learners in the Western Cape increased during the period 2009 to 2013, but dropped at the introduction of the updated NSC. As shown in Figure 1, the distribution of the marks varied by school quintile type. Learners who attended either NQ5 government schools or independent schools had considerably higher marks than learners from NQ1 to NQ4 schools. In addition, the spread of the marks varied by school quintile type, with learners from the lower quintile schools having a slightly more homogeneous distribution of marks, characterised by a relatively smaller range and an associated smaller standard deviation. These patterns were observed across all the years from 2009 to 2014. Variations in the spread of the learner marks, according to the education districts, are presented in Figure 2. From Figure 2, it is evident that for the period 2009 to 2010, the mark distribution was positively skewed, although not equally so, for all education districts. That is, the marks were generally concentrated in the lower end of the distribution. Similar patterns were observed for the years 2011 and 2012 for learners who attended schools in the Metro Central, Metro South and West Coast education districts. The mark distribution for learners who attended schools in the Metro East education district was spread over a wider range across the years, with the marks being slightly negatively skewed in 2013. Overall, Figure 1 and Figure 2 indicate that the distribution of mathematics marks for learners in the Western Cape is skewed.
FIGURE 1: Distribution of mathematics examination marks by school quintile type, 2009–2014, for learners in the Western Cape.

FIGURE 2: Distribution of mathematics examination marks by education district, 2009–2014, for learners in the Western Cape.
Results
This section reports on the results from the ordinary regression and quantile regression models applied to the NSC mathematics marks. The effect of the different school quintile types and education districts on these marks was then scrutinised. The variations in learner marks in the NSC mathematics examination were examined using ordinary regression and quantile regression at the 0.3, 0.4, 0.5, 0.6 and 0.8 quantiles. The results are presented in Table 1. The predictor variables included in the model are the school quintile type (a six-level categorical variable) and the education district (a seven-level categorical variable). For school quintile type, the NQ4 category serves as the reference (baseline) category; that is, the regression parameters of the indicator variables measure the effect on mathematics performance relative to the reference category. The selection of a reference category does not change the results; any category can be chosen to be the reference category (El-Habil, 2012; Schafer, 2006). Similarly, for education district, Cape Winelands serves as the reference category. Using a different reference category would not change the results, as they would be relative to the reference category; that is, the model would fit equally well, producing the same likelihood (Schafer, 2006).
TABLE 1a: Results from the estimation of the determinants of learner marks in the National Senior Certificate mathematics.
TABLE 1b: Results from the estimation of the determinants of learner marks in the National Senior Certificate mathematics.
TABLE 1c: Results from the estimation of the determinants of learner marks in the National Senior Certificate mathematics.
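Purely as an illustrative sketch of this specification (the study's analysis was run in SAS 9.4), the same model could be expressed in Python with statsmodels, using NQ4 and the Cape Winelands district as the reference categories; the file name, variable names and category labels below are hypothetical stand-ins for the actual data.

```python
# Sketch of the model specification: mark regressed on school quintile type
# and education district, with NQ4 and 'Cape Winelands' as reference levels.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wced_nsc_mathematics_2009.csv")  # hypothetical file

formula = (
    "mark ~ C(quintile, Treatment(reference='NQ4'))"
    " + C(district, Treatment(reference='Cape Winelands'))"
)

# Conditional-mean benchmark (ordinary regression).
ols_fit = smf.ols(formula, data=df).fit()

# Quantile regression fits at the quantile levels reported in Table 1.
quantile_fits = {
    tau: smf.quantreg(formula, data=df).fit(q=tau)
    for tau in [0.3, 0.4, 0.5, 0.6, 0.8]
}

# Each coefficient estimates the mark difference (percentage points) relative
# to the NQ4 / Cape Winelands reference categories at that quantile level.
print(ols_fit.params)
print(quantile_fits[0.5].params)
```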
In examining the goodness-of-fit assessment results (presented in Figure 3), we note that the quantile regression model fits the data adequately. From the outlier diagnostics summary, it is observed that 4.5% of the observations are greater than 1.5 times the interquartile range (IQR) above the upper quartile (i.e. Q3 + 1.5 * IQR). If the marks were normally distributed, we would expect fewer than 2.5% of such outliers. Accordingly, the ordinary regression model may not be the ideal model for the performance analysis. Consequently, quantile regression was judged to be a better fit for such data, since it does not make any assumptions about the distribution of the error terms in the model (Rodriguez & Yao, 2017). All analyses for this study were conducted using SAS 9.4 software.
FIGURE 3: Diagnostic plots for the quantile regression model.
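As a small illustrative sketch of the outlier summary quoted above, the share of observations lying more than 1.5 × IQR above the upper quartile can be computed as follows; the file and column names are hypothetical.

```python
# Sketch: proportion of marks exceeding Q3 + 1.5 * IQR (the upper-fence
# outlier rule referred to in the text). Names are hypothetical.
import pandas as pd

marks = pd.read_csv("wced_nsc_mathematics_2009.csv")["mark"]

q1, q3 = marks.quantile(0.25), marks.quantile(0.75)
iqr = q3 - q1
upper_fence = q3 + 1.5 * iqr

share_above_fence = (marks > upper_fence).mean()
print(f"{share_above_fence:.1%} of marks lie above Q3 + 1.5 * IQR")
```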
The effect of socio-economic school quintile type on the performance of learners in the NSC mathematics examination for the period 2009 to 2014 in the Western Cape
Table 1 presents the results for the ordinary regression model and for each quantile in the quantile regression analysis. Some of the results from the ordinary regression model accord with studies in the literature which analysed the final adjusted learner marks, in that they show that school quintile type is a significant determinant of learner performance. The estimated parameters represent the differences in the marks between learners in the specified school quintile types and learners in the NQ4 schools when the effect of education district is controlled for. For instance, in 2009, learners from NQ2 schools achieved, on average, 3.4 percentage points lower than those from NQ4 schools, while those from NQ3 schools achieved, on average, 4.3 percentage points lower than learners from NQ4 schools. The estimates for NQ2 schools were marginally higher than those of NQ3 schools in comparison to NQ4 schools for the years 2009, 2011 and 2013. The analysis shows that for the years 2010 and 2012, learners from NQ1, NQ2 and NQ3 schools, on average, had significantly lower marks than their NQ4 counterparts, while those from NQ5 and independent schools had significantly higher marks. These results are consistent with findings from the literature. However, in 2013, the ordinary regression results showed that learner marks from the NQ2 and NQ3 schools were not significantly different from those of learners from NQ4 schools. This contradicts findings in the literature from a study using TIMSS 2011 data, in which NQ4 learners were reported to perform better than those from NQ2 schools (Reddy et al., 2012). In 2014, the marks of learners in NQ3 schools were found to be similar to those of learners in NQ4 schools; in other words, there is no significant difference in the parameter estimates.
Quantile regression provides snapshots of the relationship between the dependent variable and the predictor variables at different quantile levels, hence adding much more value than a single conditional-mean summary (Baum, 2013). The results from the quantile regression analysis show that the effect of school quintile type and education district on learner marks in the NSC mathematics examination varied across the quantiles. While these can be read across the rows in Table 1, the pattern of the effects is illustrated in Figure 4 and Figure 5. From Table 1, the estimated parameters for school quintile type obtained by the ordinary regression model remain constant across the quantiles, as they are based on the conditional mean of the dependent variable. In contrast, the estimated parameters obtained using quantile regression vary, as the analysis is done at different points of the distribution of the dependent variable. For instance, for 2009, the ordinary regression model estimate for NQ3 schools remained constant at –4.3, while the quantile regression estimates decreased across the quantiles, from –4.0 at quantile 0.3 to –8.0 at quantile 0.8.
FIGURE 4: Parameter estimate differences between the specified school quintile type and NQ4 schools across the different quantiles.

FIGURE 5: Parameter estimate differences between the specified education district and the Cape Winelands education district across the different quantiles.
From the quantile regression results presented in Table 1, it is observed that in the years 2009 to 2011, learners who attended NQ1, NQ2 and NQ3 schools had significantly lower marks than those who attended NQ4 schools. This is evidenced by the negative parameter estimates, with an effect that is greater in the upper quantiles (0.6 and 0.8). The performance differences between the learners who attended NQ1 and NQ4 schools range from 7 to 13 percentage points, while the differences in performance between NQ2 and NQ4 learners range from 3 to 8 percentage points in the upper quantiles. Similarly, in 2012, learners who went to NQ4 schools scored significantly higher marks than their counterparts in NQ1, NQ2 and NQ3 schools. In particular, the difference between NQ3 and NQ4 schools is about 4 percentage points at the median (0.5 quantile) and 6 percentage points in quantile 0.8.
For the period 2013 to 2014, it is worth noting that in 2013 there was no evidence of significant performance differences between learners in NQ2 and NQ4 schools across all quantiles. That is, among the estimated parameters from the quantile regression analysis presented in Table 1 for 2013, there is no statistically significant difference between the estimates for the NQ2 and NQ4 school quintile types, and this holds at all quantiles of interest in the study. Furthermore, in 2013, the differences in mathematics marks between learners in the NQ3 and NQ4 schools existed only at the median (quantile 0.5), but not in the lower and upper quantiles. Unlike in 2009–2013, in 2014, the year in which the updated NSC was introduced, the performance difference between learners from NQ4 schools and their NQ3 counterparts is not significant in the lower and upper quantiles. The estimated parameters for NQ2 and NQ3 schools in the quantile regression analysis vary within a small range. The marks of learners in NQ5 schools were significantly higher than those of learners from NQ4 schools, a finding echoed in the old NSC mathematics examination results for 2009 to 2013. Differences in these marks were largest in the upper quantiles. Similar observations were made for learners from independent schools.
The effect of education district on the performance of learners in the NSC mathematics examination for the period 2009 to 2014 in the Western Cape
The ordinary regression model results show that the education district had a significant effect on the learner marks obtained for the years 2009 to 2014. Learners who attended schools in the Metro East and Metro North education districts were found to have significantly higher marks than learners who attended schools in the Cape Winelands education district. Specifically, in 2009, learners who attended schools in the Metro East education district achieved, on average, marks 7.4 percentage points higher than those who attended schools in the Cape Winelands. By 2013 this gap had widened, with learners in the Metro East achieving, on average, marks 8.6 percentage points higher than those in the Cape Winelands.
The quantile regression parameter estimates for the education district variable are presented in Table 1 and Figure 5. In contrast to the school quintile type effect, it is observed that learners from the Metro East and Metro North education districts performed significantly better than learners from the Cape Winelands education district, across the different quantiles, for all the years in the study. Learners who attended schools in the Metro Central education district generally had marks that were statistically significantly lower than those of learners in the Cape Winelands education district for the years 2009 to 2012. In 2009, the estimates increased with increasing quantiles. The parameter estimates for the Metro Central education district were negative in the lower quantiles and positive in the upper quantiles for the years 2009, 2010 and 2014. That is, the improved performance that learners from the Cape Winelands had over their Metro Central counterparts was largest in quantile 0.8. This may be due to more NQ5 category schools being located in the Cape Winelands education district, relative to the Metro Central education district. In 2013, the Metro East education district effects were largest in the lower quantiles and smallest in the upper quantile: learners’ marks were 10 percentage points higher in quantiles 0.3 and 0.4, and 6 percentage points higher in quantile 0.8, compared to the marks of learners in the Cape Winelands, as presented in Table 1.
Differences in learner performance by socio-economic school quintile type and education district across quantile and across the years 2009 to 2014
Upon obtaining the parameter estimates for our predictor variables from the quantile regression analysis, we tested the equivalence of the parameters within each predictor variable across the different quantiles. The test result indicated a rejection of the equivalence of the effect of the predictors across the different quantile levels (p < 0.05). This suggests that the magnitude of the impact that school quintile type has on learner performance in the NSC mathematics examination changes from one quantile to another. A similar equivalence test was done for the education district effects, and it was concluded that the effect of education district also varied significantly across the quantiles. The test for heteroscedasticity in linear models, based on the quantile regression statistics of Koenker and Bassett (1982), was used.
We adopted Cumming’s (2009) approach to test the equivalence of the school quintile type effect across the years. According to Cumming, if there is no overlap between the 95% confidence intervals of two parameter estimates, then the two parameters are statistically significantly different from each other; if the two confidence intervals ‘just touch’, the difference is significant at approximately p = 0.01. This is known as ‘the rule of eye’ (Cumming, 2009). Figure 6 presents the 95% confidence intervals for the parameter estimates of the school quintile types across the different years. Accordingly, the intervals that overlap, and the extent of the overlap, were identified. Using ‘the rule of eye’, the equality of some pairwise comparisons of the estimates across the years for the different school quintile types was rejected at the 5% level of significance. In particular, for NQ1 schools, in the 0.3, 0.4 and 0.6 quantiles, the parameter estimates for the year 2010 differed significantly from those of the other years (2011–2014). The NQ5 parameter estimates from the quantile regression for the years 2009 and 2010 differed significantly across the quantiles, while for the independent schools, the estimates for the years 2009 and 2010 showed substantial overlap across the quantiles.
FIGURE 6: Parameter estimates and 95% confidence limits for the school quintile types across the years.
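As an illustrative sketch of the ‘rule of eye’ comparison described above, a simple overlap check between two 95% confidence intervals could be coded as follows; the interval values are made-up numbers, not estimates from Table 1 or Figure 6.

```python
# Sketch of the 'rule of eye' (Cumming, 2009): if two independent 95%
# confidence intervals do not overlap, the estimates are taken to differ
# significantly; if they just touch, p is roughly 0.01. Values are made up.
def intervals_overlap(lower1: float, upper1: float,
                      lower2: float, upper2: float) -> bool:
    """Return True if the two intervals share at least one common value."""
    return max(lower1, lower2) <= min(upper1, upper2)

# Hypothetical 95% confidence intervals for one quintile effect in two years.
ci_2010 = (-9.5, -6.1)
ci_2013 = (-5.8, -2.4)

if intervals_overlap(*ci_2010, *ci_2013):
    print("Intervals overlap: no significant difference is claimed.")
else:
    print("No overlap: the two estimates are judged significantly different.")
```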
Another interesting result from the test of differences between parameter estimates across the years is that of NQ3 relative to NQ4 schools. For NQ3 schools, no significant differences were found between the years 2013 and 2014, across the quantiles. That is, NQ3 schools were not affected by the updated NSC, which came into effect in 2014. Similar observations were made for NQ1 and NQ2 schools when comparing the parameter estimates of 2013 and 2014.
Conclusion
Much of the existing literature has shifted to the analysis of how learner performance in school mathematics influences subsequent performance at higher education institutions. This is evident, for example, in the work by Zewotir, North and Murray (2011) and Singh, Granville and Dika (2002). In this study we analysed learner performance in school mathematics with the aim of assessing the effects of school quintile type and education district. We applied the quantile regression method to the NSC mathematics examination results from the WCED for the period 2009 to 2014, the period shortly after the NSC was introduced and during which it was revised. Several methods exist to examine learner performance in mathematics. Past national studies used traditional regression methods for the analysis of learner performance (Howie, 2003; Maree et al., 2006), which may not be valid if certain model assumptions are not satisfied.
The results of this study showed that the extent to which school quintile type and education district influenced learner performance in the NSC mathematics examination varied across the quantiles. The performance of learners from NQ1, NQ2 and NQ3 schools generally differs from that of learners from NQ5 and independent schools. The results presented in this study are similar to those of Reddy et al. (2012), who reported, using the TIMSS 2011 data set, that learners from NQ1 and NQ2 schools performed at lower levels than those in NQ4 and NQ5 schools. Interestingly, findings from our study reveal that in the later years (i.e. 2013 and 2014) the performance differences between learners from NQ4 schools and those from NQ2 and NQ3 schools tend to be insignificant in the lower and upper quantiles. This is perhaps due to the government’s continual efforts, or the schools’ intervention strategies to prepare learners for the national examination, aimed at narrowing the historical socio-economic disparities manifested in the schools.
This study had the limitation of relying on the 2009–2014 data set. However, as this was the period just after the introduction of the NSC, and included the updated NSC introduced in 2014, there is little doubt that this study can serve as a baseline for examining how the performance differences between the socio-economic school quintile types and the education districts change over the years as the NSC becomes a national norm. In particular, this study sheds light on how government policies and educators’ efforts have been somewhat successful in narrowing the historical disparities in educational inputs and systems manifested in schools. In other words, as the differences in educational inputs and systems are minimised among the socio-economic school quintile types, the performance differences between the schools will become insignificant. Despite these limitations, the findings from this study make an important contribution to the existing literature and can initiate insights and a national debate on how government can review policies and interventions to accelerate transformation across the socio-economic school quintile types and education districts where the performance gap is wide. Indeed, more detailed longitudinal studies are needed to generalise to later years and to learners across the country. Accordingly, one of the future directions of this study is to examine more recent patterns and trend differences among the low, intermediate and high performers in all school quintiles in South Africa.
Acknowledgements
The authors are grateful to Mr Brian Schreuder, Superintendent-General of the Western Cape Education Department, for permission to use their data.
Competing interests
The authors have declared that no competing interest exists.
Authors’ contributions
All authors contributed equally to this work.
Funding information
N.Z. would like to thank the National Research Foundation of South Africa, the Teaching Development Grant (TDG) and the Universities Capacity Development Programme (UCDP) for ongoing financial support.
Data availability statement
Data sharing is not applicable as no new data were created.
Disclaimer
The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.
References
Adjaye-Gbewonyo, K., Kawachi, I., Subramanian, S.V., & Avendano, M. (2018). Income inequality and cardiovascular disease risk factors in a highly unequal country: A fixed-effects analysis from South Africa. International Journal for Equity in Health, 17(1), 31. https://doi.org/10.1186/s12939-018-0741-0
Aikens, N.L., & Barbarin, O. (2008). Socioeconomic differences in reading trajectories: The contribution of family, neighborhood, and school contexts. Journal of Educational Psychology, 100(2), 235. https://doi.org/10.1037/0022-0663.100.2.235
Ally, N., & McLaren, D. (2016 November). Fees are an issue at school too, not just university. GroundUp. Retrieved January 31, 2019, from https://www.groundup.org.za/article/fees-are-issue-school-too-not-just-university/
Banerjee, P.A. (2016). A systematic review of factors linked to poor academic performance of disadvantaged students in science and maths in schools. Cogent Education, 3, 1178441. https://doi.org/10.1080/2331186X.2016.1178441
Baum, C.F. (2013). Quantile regression. Retrieved from http://fmwww.bc.edu/EC-CS2013
Baya’a, N.F. (1990). Mathematics anxiety, mathematics achievement, gender, and socio-economic status among Arab secondary students in Israel. International Journal of Mathematical Education in Science and Technology, 21(2), 319–324. https://doi.org/10.1080/0020739900210221
Boateng, N.A. (2014). Technical efficiency and primary education in South Africa: Evidence from sub-national level analyses. South African Journal of Education, 34(2), 18. https://doi.org/10.15700/201412071117
Burnett, K., & Farkas, G. (2009). Poverty and family structure effects on children’s mathematics achievement: Estimates from random and fixed effects models. The Social Science Journal, 46(2), 297–318. https://doi.org/10.1016/j.soscij.2008.12.009
Cumming, G. (2009). Inference by eye: Reading the overlap of independent confidence intervals. Statistics in Medicine, 28(2), 205–220. https://doi.org/10.1002/sim.3471
Dass, S., & Rinquest, A. (2017). School fees. In F. Verlava, A. Thom, & T. Hodgson (Eds.), Basic education rights handbook: Education rights in South Africa (pp. 140–159). Braamfontein: Section 27.
Devine, A., Fawcett, K., Szűcs, D., & Dowker, A. (2012). Gender differences in mathematics anxiety and the relation to mathematics performance while controlling for test anxiety. Behavioral and Brain Functions, 8, 1–9. https://doi.org/10.1186/1744-9081-8-33
Department of Basic Education (DBE). (2006). National norms and standards for school funding. Pretoria: DBE.
Department of Basic Education (DBE). (2016). National Senior Certificate Examination Report 2015. Pretoria: DBE.
Department of Basic Education (DBE). (2018). Education districts. Retrieved from https://www.education.gov.za/Informationfor/EducationDistricts.aspx
Eide, E., & Showalter, M.H. (1998). The effect of school quality on student performance: A quantile regression approach. Economics Letters, 58(3), 345–350. https://doi.org/10.1016/S0165-1765(97)00286-3
El-Habil, A.M. (2012). An application on multinomial logistic regression model. Pakistan Journal of Statistics and Operation Research, 8(2), 271–291. https://doi.org/10.18187/pjsor.v8i2.234
Favero, L.P., & Belfiore, P. (2019). Data science for business and decision making. London: Academic Press.
Franklin, S. (2017). School fees. In F. Verlava, A. Thom, & T. Hodgson (Eds.), Basic education rights handbook: Education rights in South Africa (pp. 140–159). Braamfontein: Section 27.
Graven, M.H. (2014). Poverty, inequality and mathematics performance: The case of South Africa’s post-apartheid context. ZDM, 46, 1039–1049. https://doi.org/10.1007/s11858-013-0566-7
Grussendorff, S., Booyse, C., & Burroughs, E. (2014). What’s in the CAPS package. A comparative study of the National Curriculum Statement (NCS) and the Curriculum and Assessment Policy Statement (CAPS): FET Phase. Pretoria: Umalusi.
Hao, L., & Naiman, D.Q. (2007a). Quantile regression. London: Sage. https://doi.org/10.4135/9781412985550
Hao, L., & Naiman, D.Q. (2007b). Quantitative applications in the social sciences (Vol. 149). London: Sage.
Howie, S. (2002). English language proficiency and contextual factors influencing mathematics achievement of secondary school pupils in South Africa. PhD thesis. Enschede: University of Twente.
Howie, S.J. (2003). Language and other background factors affecting secondary pupils’ performance in Mathematics in South Africa. African Journal of Research in Mathematics, Science and Technology Education, 7(1), 1–20. https://doi.org/10.1080/10288457.2003.10740545
Keble, J.A. (2012). An investigation into the low pass rate in science and mathematics in selected schools in the northern areas, Port Elizabeth. Masters dissertation. Port Elizabeth: Nelson Mandela Metropolitan University.
Koenker, R., & Bassett, Jr., G. (1978). Regression quantiles. Econometrica: Journal of the Econometric Society, 46(1), 33–50. https://doi.org/10.2307/1913643
Koenker, R., & Bassett, Jr., G. (1982). Robust tests for heteroscedasticity based on regression quantiles. Econometrica: Journal of the Econometric Society, 50(1), 43–61. https://doi.org/10.2307/1912528
Legotlo, M., Maaga, M., & Sebego, M. (2002). Perceptions of stakeholders on causes of poor performance in Grade 12 in a province in South Africa. South African Journal of Education, 22(2), 113–118.
Longueira, R. (2016). Exploring the functionality of the South African education quintile funding system. Pretoria: University of Pretoria.
Mahlobo, R. (2015). National benchmark test as a benchmark tool. Pretoria: Unisa Press.
Makgato, M. (2007). Factors associated with poor performance of learners in mathematics and physical science in secondary schools in Soshanguve, South Africa. Africa Education Review, 4(1), 89–103. https://doi.org/10.1080/18146620701412183
Mestry, R. (2014). The state’s responsibility to fund basic education in public schools. International handbook of educational leadership and social (in) justice (pp. 1081–1101). London: Springer. https://doi.org/10.1007/978-94-007-6555-9_54
Mestry, R., & Ndhlovu, R. (2014). The implications of the national norms and standards for school funding policy on equity in South African public schools. South African Journal of Education, 34(3), 11. https://doi.org/10.15700/201409161042
Maree, K., Aldous, C., Hattingh, A., Swanepoel, A., & Van der Linde, M. (2006). Predictors of learner performance in mathematics and science according to a large-scale study in Mpumalanga. South African Journal of Education, 26(2), 229–252.
Mji, A., & Makgato, M. (2006). Factors associated with high school learners’ poor performance: A spotlight on mathematics and physical science. South African Journal of Education, 26(2), 253–266.
Morgan, P.L., Farkas, G., Hillemeier, M.M., & Maczuga, S. (2009). Risk factors for learning-related behavior problems at 24 months of age: Population-based estimates. Journal of Abnormal Child Psychology, 37, 401. https://doi.org/10.1007/s10802-008-9279-8
Moses, E., Van der Berg, S., & Rich, K. (2017). A society divided: How unequal education quality limits social mobility in South Africa. Synthesis report. Stellenbosch: Programme to Support Pro-Poor Policy Development (PSPPD).
Motala, S., Dieltiens, V., & Sayed, Y. (2009). Physical access to schooling in South Africa: Mapping dropout, repetition and age-grade progression in two districts. Comparative Education, 45(2), 251–263. https://doi.org/10.1080/03050060902920948
Mullis, I.V., Martin, M.O., Foy, P., & Arora, A. (2012). TIMSS 2011 international results in mathematics. Washington, DC: ERIC.
National Planning Commission (NPC). (2011). Diagnostic overview. Pretoria: The Presidency.
Ogbonnaya, U.I., & Awuah, F.K. (2019). Quintile ranking of schools in South Africa and learners’ achievement in probability. Statistics Education Research Journal, 18(1), 106–119.
Ramatlapana, K., & Makonye, J. (2012). From too much freedom to too much restriction: The case of teacher autonomy from National Curriculum Statement (NCS) to Curriculum and Assessment Statement (CAPS). Africa Education Review, 9(Suppl 1), S7–S25. https://doi.org/10.1080/18146627.2012.753185
Ramohapi, S., Maimane, J., & Rankhumise, M. (2015). Investigating factors contributing to learner performance in mathematics: A case study of some selected schools in Motheo District. International Journal of Educational Sciences, 8(3), 445–451. https://doi.org/10.1080/09751122.2015.11890266
Reddy, V. (2006). Mathematics and science achievement at South African schools in TIMSS 2003. Cape Town: HSRC Press.
Reddy, V., Prinsloo, C., Arends, F., Visser, M., Winnaar, L., Feza, N., … Mthethwa, M. (2012). Highlights from TIMSS 2011: The South African perspective. Cape Town: HSRC Press.
Rodriguez, R.N., & Yao, Y. (2017). Five things you should know about quantile regression. In Proceedings of the SAS Global Forum 2017 Conference. Cary, NC: SAS Institute Inc. Retrieved from http://support.sas.com/resources/papers/proceedings17/SAS525-2017.pdf
Roscigno, V.J., & Ainsworth-Darnell, J.W. (1999). Race, cultural capital, and educational resources: Persistent inequalities and achievement returns. Sociology of Education, 72(3), 158–178. https://doi.org/10.2307/2673227
SAnews. (2018, January). Focus on district performance pays off. Portal Publishing. Retrieved September 12, 2019 from https://www.skillsportal.co.za/content/focus-district-performance-pays
Sasman, M. (2011). Insights from NSC mathematics examinations. In H. Venkat & A.A. Essien (Eds.). Proceedings of the 17th Annual AMESA Congress (Vol. 1. pp. 168–177), University of the Witwatersrand, Johannesburg, 11–15 July 2011.
Schafer, J.L. (2006). Multinomial logistic regression models. STAT 544 – Lecture 19. Wayne State University, Detroit, MI. Retrieved from https://socialwork.wayne.edu/research/pdf/multi-nomial-logistic-regression.pdf
Shavelson, R.J., McDonnell, L.M., & Oakes, J. (1989). Indicators for monitoring mathematics and science education: A sourcebook. Santa Monica, CA: Rand Corporation.
Singh, K., Granville, M., & Dika, S. (2002). Mathematics and science achievement: Effects of motivation, interest, and academic engagement. The Journal of Educational Research, 95, 323–332. https://doi.org/10.1080/00220670209596607
Spaull, N., & Kotze, J. (2015). Starting behind and staying behind in South Africa: The case of insurmountable learning deficits in mathematics. International Journal of Educational Development, 41, 13–24. https://doi.org/10.1016/j.ijedudev.2015.01.002
Umalusi. (2011). Umalusi explains matric mark adjustments. Retrieved from http://www.politicsweb.co.za/documents/umalusi-explains-matric-mark-adjustments
Umalusi. (2015). Indicators report 2008–2013 National Senior Certificate. Pretoria: Umalusi.
Western Cape Education Department (WCED). (2018). Western Cape Education Department. Retrieved from https://wcedonline.westerncape.gov.za/contact/districts
Zewotir, T., North, D., & Murray, M. (2011). Student success in entry level modules at the University of KwaZulu-Natal. South African Journal of Higher Education, 25(6), 1233–1244.
Zhang, X. (2009). Improving the profitability of direct marketing: A quantile regression approach. Master’s thesis, Lingnan University, Hong Kong. Available from http://doi.org/10.14793/mkt_etd.5
Zondo, N., Zewotir, T., & North, D. (2021). The level of difficulty and discrimination power of the items of the National Senior Certificate Mathematics Examination. Manuscript submitted for publication.