authors: Lange, Christopher; Almusharraf, Norah; Koreshnikova, Yuliya; Costley, Jamie
title: The effects of example-free instruction and worked examples on problem-solving
date: 2021-08-16
journal: Heliyon
DOI: 10.1016/j.heliyon.2021.e07785

The use of worked examples has been shown to be an effective instructional method for reducing cognitive load and successfully engaging in problem-solving. Extant research often views worked examples as an integrated part of direct instruction. Studies have examined the problem-solving effects of worked examples used in tandem with instructional explanations. However, a gap exists in research focusing on the individual problem-solving effects of example-free instructional explanations and worked examples containing no instructional explanation. This study uses a method in which worked examples are separated from direct instruction to examine the problem-solving effects of individual parts of such instruction, namely example-free instruction and worked examples containing no instructional explanation. Considering the importance of critical thinking skills in the current educational environment, the current study was conducted on a group of university students (n = 32) studying critical thinking in South Korea. Results showed that example-free instruction was more effective for problem-solving than worked examples containing no instructional explanation. Additionally, participants reported more efficient cognitive processing when critical thinking problems were presented through instructional explanation rather than worked examples. These results allow for a granular look at the different aspects of direct instruction and their effects on cognitive load and problem-solving.

When it comes to problem-solving, instructional design has traditionally focused on both conceptual explanations through direct instruction and the presentation of worked examples as part of example-based instruction (Chen et al., 2019; Chen and Kalyuga, 2020; Kirschner et al., 2006). Previous studies have tended to look at these aspects of instructional design as an integrated whole, with variations in the worked examples and direct instruction being somewhat assumed. As such, the terminology of these instructional design principles has often been conflated, as providing learners with worked examples is oftentimes considered part of direct instruction (Schwartz et al., 2011) and vice versa (Schalk et al., 2020). While previous studies are useful in showing how worked examples integrated with instructional explanation affect learning in general and problem-solving specifically, breaking instructional design down into separate parts such as example-free instruction and purely example-based instruction allows for an examination of how those individual parts affect problem-solving. Example-free instruction may be considered a specific form of direct instruction that delivers purely conceptual explanations free from examples. In contrast, purely example-based instruction may be considered the presentation of worked examples free from any instructional explanation. One of the most important tasks of education is to teach students how to use and understand information.
Students need guidance on how to sort and summarize information to avoid information overload and to assess the accuracy and reliability of information so that effective problem-solving can occur (Mohr and Mohr, 2017). Problem-solving is the process of determining a problem, discovering its cause, classifying, prioritizing, and deciding among alternative explanations, and implementing a solution (Retnowati et al., 2017). Problem-solving is ultimately encouraged through both direct instruction and worked examples. Direct instruction generally encourages problem-solving by emphasizing organized, explicit instruction, repetition of content, and mastery of prior knowledge before moving on to more demanding content (Engelmann, 1980, 2014; Engelmann et al., 1988). Worked examples encourage problem-solving by introducing a problem statement along with a solution presented through a series of procedural steps, offering a high level of support to students (Chen et al., 2019).

The effect of both direct instruction and worked examples on problem-solving is commonly explained through cognitive load theory. Cognitive load refers to the sum of information that working memory can process simultaneously (Klepsch and Seufert, 2020). Cognitive load theory consists of three main elements, all of which play a role in learners' ability to solve problems. Extraneous cognitive load occurs when a learner's working memory is overloaded with information that is non-essential to the specific learning task, which has a negative effect on problem-solving (Leppink et al., 2013). This often occurs through the instructional presentation of redundant or non-essential information and is therefore viewed as the result of a design flaw (Kalyuga et al., 1999). Intrinsic load represents the complexity of the content, based on the number of interacting elements of the problem as well as the learner's prior knowledge of the content associated with the problem (Van Merriënboer and Sweller, 2010). Problems with more interacting parts generally increase intrinsic load and decrease the likelihood of effective problem-solving (Klepsch and Seufert, 2020). Finally, germane load represents the successful effort to transfer information from working memory to long-term memory and is associated with successful problem-solving. Generally, instruction aims to alter long-term memory, and direct instruction's explicit approach is in accordance with this aim. Long-term memory plays a vital role in cognition and has more than a peripheral role in problem-solving; hence, an explanation or instruction is deemed ineffective if it cannot be stored in or retrieved from long-term memory with enhanced efficiency (Kirschner et al., 2006).

Explicit and clear explanations as part of direct instruction help learners recognize problem structure and interpret data efficiently (Kirschner et al., 2006). Through this, direct instruction has been linked to successful learning in general and effective problem-solving specifically (Stockard, 2010; Stockard et al., 2018). The mechanism behind such successful problem-solving brought about through explicit and clear explanations may lie in the use of instructional explanation under certain conditions where element interactivity is also a factor. Element interactivity refers to the number of interacting parts associated with a specific problem (Lu et al., 2020).
Low element interactivity occurs when isolated parts of a problem are presented to learners (Chen and Kalyuga, 2020). In this situation, the problem is considered less complex because the information in the isolated parts can be processed on its own (Chen and Kalyuga, 2020). On the other hand, when many parts of a problem are integrated together in what is considered high element interactivity, the problem becomes more complex and often introduces higher levels of intrinsic cognitive load (Chen and Kalyuga, 2020). Instructional explanations free from examples appear to help learners prepare for processing more complex content containing interacting elements. Chen et al. (2019) suggest that explicit instructional explanations help learners effectively process content containing multiple interacting parts in the problem-solving stage. Similarly, Paas et al. (2003) claim that pure instructional explanations lead to schema formation early on that can be applied to problem-solving involving highly interactive components later on. Kalyuga and Singh (2016) explain that background information on a particular topic presented as part of instructional explanations helps learners avoid extraneous processing through means-ends analysis or trial-and-error techniques that would otherwise be used in the problem-solving stage. Additionally, Gerjets et al. (2006) postulate that when learners do not have the required background knowledge to effectively self-explain the steps worked out in the form of examples, instructional explanation serves as a more valuable approach. It is important to note that information delivered through instructional explanations is processed successfully only when the relayed information is sufficiently detailed to avoid the misinterpretation that may arise from working memory overload (Lilian et al., 2019).

Although explanations as part of the direct instruction process have been shown to lead to successful problem-solving (Coughlin, 2014), and problem-solving through such instructional explanation reaches its maximum effectiveness when explanations are meticulously chosen in a systematic order (Lilian et al., 2019), there are situations in which instructional explanations are not practical. For example, when instructions are ambiguous or vague, students are bound to make inefficient use of their time when solving a problem, which can hamper future development and progression (Stockard et al., 2018). Additionally, such instructional delivery could lead to cognitive processing issues and, ultimately, an increase in cognitive load. For example, Ziegler and Stern (2016) showed that direct instruction alone could not reduce cognitive load and that example comparisons integrated within the direct instruction serve as a more likely cognitive-load-reducing approach. Without the implementation of examples as part of direct instruction, learners may tend to resort to more passive learning and superficial processing of information (Berthold and Renkl, 2010; Chi, 2011). This occurs when learners suppress their ability to make inferences because explicit information has already been delivered to them (Chi, 2011), ultimately leading to issues regarding information transfer when problem-solving (Berthold and Renkl, 2010).
When compared to worked examples, relying on instructional explanation has been shown to reduce the amount of self-explanation undertaken by learners, ultimately negatively affecting learning (Schworm and Renkl, 2006). Furthermore, Wittwer and Renkl (2010) found that the instructional explanations given in preparation for worked examples help with understanding concepts but do not necessarily help with applying those concepts to problem-solving. Previous research has shown that the direct instruction approach is negatively associated with developing critical thinking (Lekalakala-Mokgele, 2010; Kek and Huijser, 2011) and is, therefore, more suitable for the development of lower-order skills such as memorization of facts, the study of algorithms, and following a given model (Schunk, 2012). Such learning does not require students to think about the concepts that are "delivered" to them by instructors.

From a purely example-based instructional point of view, worked examples represent an instructional method that offers step-by-step guidance on how to solve a particular problem (Kirschner et al., 2006; Wittwer and Renkl, 2010). The self-explanation that occurs through the solution steps of worked examples allows learners to integrate existing knowledge with new concepts to help them learn how to solve problems (Chen et al., 2019). Chen and Kalyuga (2020) explain that, conceptually, worked examples are in line with Bandura's (1986) borrowing and reorganizing principle, which indicates that learners imitate a set of problem-solving steps and, consequently, reconstruct the learning with current knowledge. Worked examples free up cognitive resources when solving problems (Wittwer and Renkl, 2010). Cognitive load is reduced through the step-by-step delivery of worked examples, which allows learners to visualize how problems are solved and apply the newly learned information to solving similar problems (Chen and Kalyuga, 2020). Subsequently, this leads to implementing newly attained knowledge in novel problem-solving situations (Bandura, 1986; Van Gog et al., 2004; Wittwer and Renkl, 2010). The study of worked examples has proven effective for learning compared to problem-solving alone, as it prevents extraneous processing that may occur in the absence of worked examples when solving problems (Klepsch and Seufert, 2020; Retnowati et al., 2017). Additionally, the guidance given to learners in the early stages of worked examples has led to effective problem-solving free from instructional guidance in the final stages of the learning process (Chen et al., 2019). The mechanism of worked examples that allows for more efficient processing of information is generally tied to self-explanation. Rather than simply mirroring the solution steps provided in worked examples, learners can internalize why those steps lead to a particular solution (Renkl, 2014). Through self-explanation, learners are able to justify the solution steps by integrating previously learned information, allowing them to apply the same concepts to different problems containing similar content (Chen et al., 2019). For example, in a critical thinking class, a student may learn how to identify the parts of a specific argument by self-explaining the steps taken to identify those parts and then be able to apply those same concepts to identify the various parts of a completely different argument.
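To make this distinction concrete, the sketch below shows one way a worked example of this kind (a problem statement plus ordered solution steps and a final identification) could be represented. It is a minimal illustration only; the argument text, class name, and step wording are hypothetical and are not the materials used in this study.

```python
# Hypothetical sketch of a worked example for argument-part identification:
# a problem statement, ordered solution steps, and the resulting identification.
from dataclasses import dataclass, field
from typing import List


@dataclass
class WorkedExample:
    problem: str                                      # the argument to analyze
    steps: List[str] = field(default_factory=list)    # ordered solution steps
    solution: str = ""                                # final identification


example = WorkedExample(
    problem="Regular exercise improves mood; therefore, students should exercise daily.",
    steps=[
        "Step 1: Locate the conclusion indicator word ('therefore').",
        "Step 2: The claim after the indicator is the conclusion.",
        "Step 3: The remaining claim gives a reason for the conclusion, so it is the premise.",
    ],
    solution=("Premise: 'Regular exercise improves mood.' "
              "Conclusion: 'Students should exercise daily.'"),
)

for step in example.steps:
    print(step)
print(example.solution)
```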
Using such self-explanation allows learners to avoid extraneous processing attributed to ineffective study strategies that ultimately interfere with solving the problem (Van Gog et al., 2004). Because a variety of factors influence the effectiveness of worked examples, there are several design issues that need to be considered. Overall, worked examples must be adequately planned to simplify knowledge construction (Atkinson et al., 2000). Additionally, information should be presented in a form that logically contributes to students' knowledge construction and organization. For authentic learning from worked examples to occur, students must actively process the available information as a decisive step (Nainan and Balakrishnan, 2019; Renkl, 2014). Furthermore, since worked examples characteristically contain only product-oriented information (i.e., the solution steps and the result), they are not particularly effective in enabling the acquisition of deep and flexible knowledge, which is reflective of critical thinking (Van Gog et al., 2004). As such, when students are given worked examples, they may be more likely to focus on minute, non-essential parts of the problem rather than the fundamental concept underlying the problem (Lilian et al., 2019). It should further be noted that the influence of worked examples depends on the number of interacting components the problems have (Chen et al., 2016). Specifically, when worked examples are overly complex, as is often the case with examples containing high element interactivity, Renkl (2002) argues that the generation of effective self-explanation becomes less likely, rendering instructional explanations more effective for problem-solving in such situations. Other issues involving self-explanation occur because some learners are not necessarily inclined to self-explain the solution steps without being prompted to do so, and therefore instructional prompts may be needed to ensure effective use of worked examples (Renkl, 2014). Additionally, learners may face difficulties with more complex content and misinterpret the meaning behind the actual solution steps in worked examples, resulting in failure to grasp a deeper conceptualization (Renkl, 1997; Wittwer and Renkl, 2010). All of these issues associated with the presentation of worked examples need to be considered depending on the specific problem-solving context.

In today's society, the question of which new skills students need is highly relevant. In the labor market, some professions are being automated, and universal competencies, including critical thinking, are becoming more and more in demand (Gruzdev et al., 2018). Despite recognition of the importance of critical thinking development, recent research shows that students are making relatively modest progress towards developing universal competencies, including critical thinking, or in some cases are even stagnating (Arum and Roksa, 2011; Loyalka et al., 2021). It is important to note that critical thinking is not an innate ability but an acquired skill that needs to be developed. While some students may be naturally inquisitive, they require training to become systematically analytical, fair, and unbiased in their pursuit of knowledge (Snyder and Snyder, 2008). In this regard, it is vital to research practices that promote critical thinking development as one of the key learning outcomes for students.
Despite more than a century of research on critical thinking in education, there is no consensus in the modern scientific and educational community as to which skills and character traits are part of the complex construct called critical thinking (Ennis, 1985; Facione, 1990; Halpern, 1998; Liu et al., 2014; Scriven and Paul, 1987). At the same time, many authors emphasize the ability to work with arguments as one of the key components of critical thinking (Ennis, 1985; Facione, 1990). This study understands critical thinking as purposeful, self-regulating judgment, reflected in the interpretation, analysis, assessment, inference, and explanation of the evidence-based, conceptual, methodological, and contextual assumptions on which this judgment is based (Facione, 1990). According to researchers of critical thinking, analysis is the part of critical thinking that allows one to recognize the structure of arguments. Evaluation is the critical thinking skill of assessing the reliability, relevance, and logical strength of an argument and deciding whether the argument is strong or weak. Inference includes collecting reliable, relevant, and logical information based on previous analysis and assessment of available evidence (Facione, 1990). It is necessary to select an appropriate instructional design to develop critical thinking (Snyder and Snyder, 2008). Many studies have examined the most influential critical thinking development practices (Tiruneh et al., 2014; Abrami et al., 2015). The present research is likewise aimed at examining the instructional design that is most suitable for the development of critical thinking. An essential feature of this study is that it relates both example-free direct instruction and the presentation of worked examples free from any instructional explanation to the ability to work with arguments as a key skill in critical thinking. While critical thinking has been selected as an example of one of the key educational outcomes, the authors of this study suggest that the findings can be used to develop other skills as well.

Because instructional explanations and worked examples have often been examined as an integrated whole, separating the two should clarify the individual instructional techniques that contribute to successful problem-solving. While no research has been found that explicitly examines the effects of pure explanations compared to the effects of pure worked examples, the literature provides conceptual explanations as to the efficacy of each. Additionally, there is some tension in the research, in that some studies favor instructional explanations over worked examples for problem-solving and vice versa. Thus, this study aims to add to existing research by examining the effects of what we call example-free explanations compared to the effects of worked examples free from any instructional explanations. Additionally, this study seeks to examine germane load and intrinsic load levels, which may help explain the findings from a cognitive load perspective. As such, this study asks whether problem-solving scores, intrinsic load levels, and germane load levels differ depending on whether learners receive example-free instruction or worked examples free from instructional explanation.

The first step of this research was to conduct a thorough review of extant research on worked examples and direct instruction. From this review, it became apparent that extant studies treat direct instruction and worked examples as an integrated whole. Furthermore, the term direct instruction has generally been used loosely to represent any kind of explanation, even when worked examples are integrated into the explanation.
To fill these gaps, this study examines a specific form of direct instruction, "example-free instruction", as opposed to worked examples free from instructional explanations. This was done to allow a more granular examination of the factors that induce high performance levels in students' problem-solving.

In terms of the development of materials, the decision was made to focus on the different effects of worked examples and example-free instruction among a group of students (n = 32) studying critical thinking in the liberal arts department of a university in South Korea during the spring semester of 2020. Of the total number of participants, 25 were female and 7 were male. The average age of the participants was 22.4, with a standard deviation of 2.8. The classes associated with the course of study focused on the use of critical thinking to analyze, evaluate, and construct arguments. This subject was selected due to the amount of element interactivity associated with solving problems from a critical thinking perspective, as the current study includes problem-solving through argument analysis containing multiple interacting elements. Although initially designed for face-to-face learning, due to the COVID-19 pandemic of 2020, the class examined in this study took place during real-time online lectures using the Zoom platform. As part of the in-class work and participation score for the class, the students participated in weekly activities that focused on the following topics: identifying claims and non-claims, differentiating between explanations and arguments, identifying various parts of the structure of arguments, and differentiating between arguments with multiple conclusions and chain arguments. These topics and the activities associated with them serve as the context for the present study.

The independent variables of this study include both example-free instruction and worked examples. In order for both types of content delivery (example-free instruction and worked examples) to be represented among all four topics of this study (claims vs. non-claims, arguments vs. explanations, argument structure, and arguments with multiple conclusions vs. chain arguments), four example-free instruction videos (one for each topic) were made, and a series of worked examples was made for each of the four topics. The example-free instruction videos were relatively short, three to six minutes in length. They consisted of direct instruction containing essential explanations of how to solve problems relating to the topic by introducing descriptive definitions of terms and how to apply the newly gained knowledge of those terms to solving critical thinking problems. It is important to note that much care was taken not to include any examples as part of the instruction within the explanations. For example, as part of the goal associated with the topic of argument structure, the students needed to learn how to identify the premise and the conclusion of an argument. In this situation, the term "premise" was explained as a claim that serves as a reason or evidence that supports the conclusion, and the term "conclusion" was explained as the final outcome of an argument that is supported by the premise. No statements were given that could serve as an example of a specific premise or a specific conclusion used in an argument. Conversely, the worked examples contained no direct instruction.
They focused only on providing students with examples of problems based on the specific topic being studied and the solution steps for solving those problems. For each topic (claims vs. non-claims, arguments vs. explanations, argument structure, and arguments with multiple conclusions vs. chain arguments), a series of ten problems was designed, the results of which serve as the dependent variable of this study.

The experiment took place in the context of the participants' regularly scheduled classes using the Zoom platform. The participants had no prior instruction on the topics covered in the experiment. During a scheduled class, two separate links were posted as a group chat message in the Zoom class. Half of the participants clicked on a link that connected them to a Google Form in which they received content for the first two topics (claims vs. non-claims and arguments vs. explanations) via the example-free instruction videos and content for the remaining two topics (argument structure and arguments with multiple conclusions vs. chain arguments) via worked examples. The other half of the participants clicked on a separate link that connected them to a Google Form in which they received content for the first two topics (claims vs. non-claims and arguments vs. explanations) via worked examples and content for the remaining two topics (argument structure and arguments with multiple conclusions vs. chain arguments) via example-free instruction videos. Upon completing either the example-free instruction videos or the worked examples, all of the participants were asked to solve the problems associated with the topics they had just learned. Their scores were collected, and the data were used for analysis to determine if there was a difference in problem-solving scores depending on whether students received example-free instruction or worked examples. Because the experiment conducted as part of this study was not focused on the differences between a control group and an experimental group, there was no need to use pre-tests or post-tests to determine which group improved more based on whether they received an experimental intervention or not. Every participant in this study received both conditions (example-free instruction and worked examples), and therefore the differences in problem-solving results across the varying conditions are used for analysis. Furthermore, the differences in cognitive load based on those same varying conditions are also used for analysis. A similar experimental design has been used in previous research, including Lange et al. (2016).

Although cognitive load theory contains three elements (extraneous load, intrinsic load, and germane load), this study chose to examine only intrinsic load and germane load. This was done to identify how complex learners perceived the various instructional conditions to be and to identify their comprehension levels based on those same instructional conditions. The complexity of the content has been linked to problem-solving outcomes (Van Merriënboer and Sweller, 2010), and therefore this study seeks to examine intrinsic load levels. While extraneous load is tied to poorly designed instruction, which can have an effect on problem-solving, the current study is more interested in the complexity of the content rather than the clarity of the instruction. It should be noted that in both conditions, only information relevant to solving the problem was delivered.
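As a rough illustration of the counterbalanced, within-subjects assignment described above (each participant receives both conditions, with the two topic pairs swapped between the two links), the following sketch is offered. The link labels and function name are hypothetical and are not taken from the study's materials.

```python
# Minimal sketch of the counterbalanced within-subjects assignment: which topics
# are delivered as example-free instruction vs. worked examples depends on the link.
TOPICS_FIRST_HALF = ["claims vs. non-claims", "arguments vs. explanations"]
TOPICS_SECOND_HALF = ["argument structure", "multiple conclusions vs. chain arguments"]


def assign_conditions(link: str) -> dict:
    """Return which topic pair is delivered under each condition for a given link."""
    if link == "form_A":
        return {"example_free": TOPICS_FIRST_HALF, "worked_examples": TOPICS_SECOND_HALF}
    if link == "form_B":
        return {"example_free": TOPICS_SECOND_HALF, "worked_examples": TOPICS_FIRST_HALF}
    raise ValueError(f"unknown link: {link}")


print(assign_conditions("form_A"))
print(assign_conditions("form_B"))
```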
Extreme care was taken not to present any extraneous information. Care was taken to make sure the instruction was simple and straightforward, delivering only information related to the specific topics. Justification for examining only the elements of cognitive load theory relevant to a particular study is provided by several studies, including Hughes et al. (2021).

To determine intrinsic load and germane load levels based on both example-free instruction and worked examples, subjective measurements were obtained through survey analysis. Following the problem-solving phase in response to either example-free explanations or worked examples, the participants were asked a set of three questions used to determine their levels of intrinsic load and a set of four questions used to determine their levels of germane load based on the specific instruction they had just received. The items used for both intrinsic load and germane load measurements were adapted from Leppink et al. (2013), who originally developed the items as a way to measure various aspects of cognitive load. Justification for the use of the instrument is provided by Leppink et al. (2013), whose factor analysis showed that all cognitive load elements loaded separately and therefore represented distinct factors. All cognitive load items used in the current study were measured using a Likert-type scale that ranged from 0 to 10, with 0 representing strongly disagree and 10 representing strongly agree. This scale is in accordance with the original one developed by Leppink et al. (2013), which was also set at 0 to 10. The three items used for intrinsic load measurement based on the example-free instruction are as follows: 1) The topics covered in the lecture were very complex, 2) The lecture covered information that I perceived as very complex, and 3) The lecture covered concepts and definitions that I perceived as very complex. The Cronbach's alpha for the intrinsic load construct pertaining to the example-free instruction was an acceptable .941. The same three items were used for the intrinsic load measurement based on the worked examples, with the minor modification of the word "lecture" in each item being replaced with the word "examples". The Cronbach's alpha for the intrinsic load construct pertaining to the worked examples was an acceptable .962. The four items used for the germane load measurement based on the example-free instruction are as follows: 1) The lecture really enhanced my understanding of the topic, 2) The lecture really enhanced my knowledge and understanding of the class subject, 3) The lecture really enhanced my understanding of the concepts associated with the class subject, and 4) The lecture really enhanced my understanding of concepts and definitions. The Cronbach's alpha for the germane load construct pertaining to example-free instruction was an acceptable .941. The same four items were used for the germane load measurement based on the worked examples, with the minor modification of the word "lecture" in each item being replaced with the word "examples". The Cronbach's alpha for the germane load construct pertaining to worked examples was an acceptable .921.
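For reference, the reliability coefficients reported above are Cronbach's alpha values; a standard formulation of the coefficient (added here for reference, not stated in the original) is

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$

where \(k\) is the number of items (three for intrinsic load, four for germane load), \(\sigma^{2}_{Y_i}\) is the variance of responses to item \(i\), and \(\sigma^{2}_{X}\) is the variance of the total score formed by summing all items.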
In order to answer the first research question, an analysis was performed to find out whether there was a difference in problem-solving scores based on receiving either example-free instruction or worked examples. T-testing was used to determine the difference in scores between the example-free instruction condition and the worked examples condition. Within the example-free instruction condition, students had an average problem-solving score of 7.36, while within the worked examples condition, students had an average problem-solving score of 6.37 (Table 1). As shown in Table 1, there was a significant difference in problem-solving scores depending on whether students received example-free explanation or worked examples.

To answer the second research question, the intrinsic load levels of both the example-free instruction and worked examples were compared. As can be seen in Table 2, there was no statistically significant difference between the conditions. The example-free instruction condition had a mean intrinsic cognitive load score of 4.21, and the worked examples condition had a mean intrinsic load score of 4.34. Based on these results, students did not perceive different levels of intrinsic load based on the experimental condition. Germane load means of all the participants were calculated, and comparisons were made of germane load levels between example-free instruction and worked example instruction. As can be seen in Table 3, the overall germane load difference favors example-free instruction (p = .04), with a germane load mean score of 7.88 when participants received the example-free condition compared to a germane load mean score of 6.61 when participants received the worked examples condition. Based on these results, when students received example-free explanations, their levels of comprehension were higher than when they received worked examples.
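Given the within-subjects design, a paired comparison is one plausible way to run the t-tests reported above. The sketch below, using hypothetical placeholder scores rather than the study's data, shows how such a comparison could be computed, assuming scipy is available.

```python
# Minimal sketch of a paired (within-subjects) t-test between the two conditions.
import numpy as np
from scipy import stats

# Hypothetical placeholder problem-solving scores (0-10 scale), not the study's data.
example_free_scores = np.array([8, 7, 9, 6, 8, 7, 7, 8])
worked_example_scores = np.array([6, 7, 7, 5, 6, 7, 6, 7])

# Each participant contributes a score under both conditions, so a paired test applies.
t_stat, p_value = stats.ttest_rel(example_free_scores, worked_example_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```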
While much research exists showing the effects of worked examples as part of instructional explanations (Schwartz et al., 2011; Schalk et al., 2020), the current study separated each variable in an attempt to examine particular aspects of the process. By creating two conditions, one in which critical thinking instruction was delivered via example-free instruction and one in which it was delivered via worked examples with no instructional explanations, this study found that overall, students were more successful at problem-solving when they received example-free instruction. Additionally, the participants showed no difference in intrinsic load levels based on the type of instruction received. This suggests that the participants of this study viewed the content of both example-free instruction and worked examples to be at a similar level of complexity. Furthermore, the germane load levels examined in this study support the main results in that germane load, which generally reflects reduced levels of the negative aspects of cognitive load (Klepsch and Seufert, 2020), was generally higher under example-free instruction. This makes sense, as a reduction of cognitive load is often used to explain how effective problem-solving is likely to occur in such instructional situations. Extant research supports the notion that a number of variables affect problem-solving, as such research considers worked examples as part of direct instruction (Atkinson et al., 2000). However, the present study was able to show that direct instruction free from any worked examples leads to greater success in problem-solving.

Upon initial examination, the results may be interpreted as showing that providing instructional explanations free from examples is more effective for problem-solving than presenting worked examples free from instructional explanations. But when we look further into the findings, the explanation is more complicated than that. Specifically, element interactivity and the subject matter of the content in the current study may explain why example-free instruction produced higher problem-solving results than worked examples. For the most part, the content in this study contained multiple interacting elements, and in general, analyzing arguments through critical thinking is not considered to be surface-level learning (Van Gog et al., 2004). Therefore, when using critical thinking to analyze arguments with multiple elements, clearly explaining the content may be more appropriate than providing worked examples free from any explanation. Generally speaking, explaining concepts and ideas rather than demonstrating solution steps through worked examples has been promoted for various reasons, including providing detailed information to avoid extraneous processing during problem-solving (Gerjets et al., 2006). In the current study, such explanations apparently became more useful than worked examples when it came to solving critical thinking problems with multiple interacting parts. The current study's results can further be explained in that clearly explaining abstract theories and ideas and drawing understandable conclusions helps to increase educational outcomes, including critical thinking (Loes et al., 2015; Feldman, 1989). In this way, teachers help students understand the basic concepts of the subject, in this case analyzing arguments through critical thinking, to help them effectively solve problems. From a cognitive load perspective, problems containing multiple interacting parts require more processing within short-term memory (Lu et al., 2020), thus adding cognitive load and making it more challenging to transfer information to long-term memory. It has been suggested that self-explanation, perhaps the most critical aspect of processing worked examples, may no longer be effective past a tipping point where the example is too complicated to self-explain (Renkl, 2002). Additionally, it has been postulated that in such cases students would benefit more from instructional explanations (Renkl, 2002). In the present study, it could be the case that, given the nature of argument analysis using critical thinking and the multiple interacting elements associated with it, the participants were able to comprehend the material better when specific ideas and concepts were explained to them. This is opposed to being presented with worked examples, which may have created too great a cognitive processing burden on the participants to self-explain the steps in the worked examples due to the nature of the content. This would support Renkl's (2002) suggestion that problems containing multiple interacting elements need some form of instructional explanation for effective processing to occur. This study allows us to examine the individual effects of both pure explanations and pure worked examples. Because worked examples are often examined as part of instructional explanations, this study allowed us to see the effects of the individual instructional parts.
A further contribution of this study is that, while some research has claimed that instructional explanations may be more effective than worked examples when attempting to solve problems containing multiple interacting elements, this study provides empirical support for that notion. Instructors need to be aware that when presenting complex problems that may overload cognitive processing through multiple connecting elements, pure explanations may better provide the detail needed to enhance problem-solving, as self-explanations may become ineffective at this point. While these results add to the current research field, further follow-up focusing on subjects other than critical thinking would be helpful in determining how much these instructional practices affect problem-solving in other contexts. Nevertheless, we do know that within a critical thinking context, it appears that explaining argument structure and analysis with multiple interacting elements would be better served for problem-solving if instructional explanations are provided rather than worked examples.

Author contribution statement
Christopher Lange: Conceived and designed the experiments; Wrote the paper.
Jamie Costley: Performed the experiments; Wrote the paper.
Yuliya Koreshnikova: Contributed reagents, materials, analysis tools or data.
Norah Almusharraf: Contributed reagents, materials, analysis tools or data; Wrote the paper.

This work was supported by Prince Sultan University (CH203). Data will be made available on request.

References
Strategies for teaching students to think critically: a meta-analysis
Academically Adrift: Limited Learning on College Campuses
Learning from examples: instructional principles from the worked examples research
Social Foundations of Thought and Action
How to foster active processing of explanations in instructional communication
Cognitive load theory, spacing effect, and working memory resources depletion: implications for instructional design
Relations between the worked example and generation effects on immediate and delayed tests
Learning from worked examples, erroneous examples, and problem solving: toward adaptive selection of learning activities
Theoretical perspectives, methodological approaches, and trends in the study of expertise
Outcomes of Engelmann's direct instruction: research syntheses
Direct Instruction. Educational Technology Publications
Research from the inside: the development and testing of DI programs
The direct instruction follow through model: design and outcomes
A logical basis for measuring critical thinking skills
Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction (The Delphi Report)
The association between student ratings of specific instructional dimensions and student achievement: refining and extending the synthesis of data from multisection validity studies
Can learning from molar and modular worked examples be enhanced by providing instructional explanations and prompting self-explanations?
University graduates' soft skills: the employers' opinion
Teaching critical thinking for transfer across domains: disposition, skills, structure training, and metacognitive monitoring
The relationship between attention and extraneous load
Rethinking the boundaries of cognitive load theory in complex learning
Managing split-attention and redundancy in multimedia instruction
The power of problem-based learning in developing critical thinking skills: preparing students for tomorrow's digital futures in today's classrooms
Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching
Understanding instructional design effects by differentiated measurement of intrinsic, extraneous, and germane cognitive load
Informal cooperative learning in small groups: the effect of scaffolding on participation
Facilitation in problem-based learning: experiencing the locus of control
Development of an instrument for measuring different types of cognitive load
Social studies curriculum and cooperation among preschool learners in Nairobi County, Kenya: addressing effectiveness of instructional methods
Assessing critical thinking in higher education: current state and directions for next generation assessment
Student perceptions of effective instruction and the development of critical thinking: a replication and extension
Skill levels and gains in university STEM education in China, India, Russia and the United States
Altering element interactivity and variability in example-practice sequences to enhance learning to write Chinese characters
Understanding Generation Z students to promote a contemporary learning environment
Design and evaluation of worked examples for teaching and learning introductory programming at tertiary level. Malays
Cognitive load theory and instructional design: recent developments
Learning from worked examples: a study on individual differences
Worked-out examples: instructional explanations support learning by self-explanations
Toward an instructionally oriented theory of example-based learning
Can collaborative learning improve the effectiveness of worked examples in learning mathematics?
Providing worked examples for learning multiple principles
Learning Theories, an Educational Perspective
Practicing versus inventing with contrasting cases: the effects of telling first on learning and transfer
Computer-supported example-based learning: when instructional explanations reduce self-explanations
Critical thinking
Teaching critical thinking and problem solving skills
Promoting reading achievement and countering the "fourth-grade slump": the impact of Direct Instruction on reading achievement in fifth grade
The effectiveness of direct instruction curricula: a meta-analysis of a half century of research
Effectiveness of critical thinking instruction in higher education: a systematic review of intervention studies
Process-oriented worked examples: improving transfer performance through enhanced understanding
Cognitive load theory in health professional education: design principles and strategies
How effective are instructional explanations in example-based learning? A meta-analytic review
Consistent advantages of contrasted comparisons: algebra learning under direct instruction

The authors declare no conflict of interest. No additional information is available for this paper.