An investigation in assessments for distance learning in the Secondary
                        Mathematics Classroom.

                            By Jessica Britton

            A Dissertation Submitted in Partial Fulfillment of the
                      Requirements for the Degree of

                         Master of Science in Mathematics
                                    at the

                California State University, Channel Islands

                                 May 2021
©2021
    Jessica Britton
ALL RIGHTS RESERVED

MS Thesis in Mathematics of Jessica Britton

APPROVED FOR THE MATHEMATICS PROGRAM

                                     05/25/2021

      Dr. Ivona Grzegorczyk
                                     05/24/2021

      Dr. Jorge Garcia
                                     05/25/2021

       Dr. Brooke Ernest

      APPROVED FOR THE UNIVERSITY

                                      05/25/2021

   Interim Dean Dr. Jill Leafstedt
Non-Exclusive Distribution License

In order for California State University Channel Islands (CSUCI) to reproduce, translate and
distribute your submission worldwide through the CSUCI Institutional Repository, your agreement to
the following terms is necessary. The author(s) retain any copyright currently on the item as well as
the ability to submit the item to publishers or other repositories.

By signing and submitting this license, you (the author(s) or copyright owner) grant to CSUCI the
nonexclusive right to reproduce, translate (as defined below), and/or distribute your submission
(including the abstract) worldwide in print and electronic format and in any medium, including but not
limited to audio or video.

You agree that CSUCI may, without changing the content, translate the submission to any medium
or format for the purpose of preservation.

You also agree that CSUCI may keep more than one copy of this submission for purposes of
security, backup and preservation.

You represent that the submission is your original work, and that you have the right to grant the
rights contained in this license. You also represent that your submission does not, to the best of
your knowledge, infringe upon anyone's copyright. You also represent and warrant that the
submission contains no libelous or other unlawful matter and makes no improper invasion of the
privacy of any other person.

If the submission contains material for which you do not hold copyright, you represent that you have
obtained the unrestricted permission of the copyright owner to grant CSUCI the rights required by
this license, and that such third party owned material is clearly identified and acknowledged within
the text or content of the submission. You take full responsibility to obtain permission to use any
material that is not your own. This permission must be granted to you before you sign this form.

IF THE SUBMISSION IS BASED UPON WORK THAT HAS BEEN SPONSORED OR SUPPORTED
BY AN AGENCY OR ORGANIZATION OTHER THAN CSUCI, YOU REPRESENT THAT YOU
HAVE FULFILLED ANY RIGHT OF REVIEW OR OTHER OBLIGATIONS REQUIRED BY SUCH
CONTRACT OR AGREEMENT.

The CSUCI Institutional Repository will clearly identify your name(s) as the author(s) or owner(s) of
the submission, and will not make any alteration, other than as allowed by this license, to your
submission.

An investigation in assessments for distance learning in the Secondary Mathematics
Classroom
Title of Item
A Dissertation for the degree of mathematics
3 to 5 keywords or phrases to describe the item
  Jessica Britton
Author(s) Name (Print)

                                                                                                         6/25/2021
Author(s) Signature                                                                                         Date

                                                     This is a permitted, modified version of the Non-exclusive Distribution
                                                                  License from MIT Libraries and the University of Kansas.
Acknowledgments

First and foremost, I would like to thank Dr. Ivona Grzegorczyk for her efforts in helping and supporting me through my journey on this research and in my journey as a mathematics educator. The insight and experience she brings to the profession influenced me as an undergraduate student and inspired me as a professional.

Thank you also to Dr. Jorge Garcia for challenging me in my thinking and showing me that some of the best lessons are taught outside of a classroom. Dr. Garcia showed me that creating positive relationships with your students will always yield the best results.

Lastly, I would like to thank my family, friends, and colleagues who have supported me in my endeavor to complete this research. The support I received was critical to my success in the project. A special thank you to my husband, Jermaine Britton, my sounding board and biggest supporter. I could not have gotten through this time without him.

Abstract

       In this study, we investigate different forms of assessment tools used by three groups of high school mathematics students during mandatory distance learning instruction. We compare the three groups' achievement and participation levels on various online assessment tools, along with completion rates on accompanying learning activities. We sought to determine which activities are effective and can be successfully used for assessing and evaluating student progress in the online learning environment. The study was conducted in three consecutive parts and included analysis of classroom data from learning activities, course assessments, and student surveys obtained during three academic quarters of the COVID pandemic. Our results yield the following conclusions: the technical ease or difficulty of an assessment is not related to student achievement; more information about skills mastery is obtained during synchronous activities than through asynchronous assessment; activities requiring explanations of student thinking give a better measure of learning progress; and fewer, more focused assessments yield more participation and better scores. Additionally, the vast majority of both students and instructors support using online assessment activities in a regular in-person classroom. Our results are comprehensive and helpful for anyone designing an online curriculum in mathematics.

Key Words: distance learning, digital assessments in mathematics, synchronous activities, asynchronous activities.

Table of Contents

Abstract

1. Introduction
    1.1  Background
    1.2  Research Questions
    1.3  Participants
2. Methodology
3. Objective
4. Data Collection Method
5. Part 1
    5.1  Introduction
    5.2  Description of Assessments
    5.3  Hypothesis testing
    5.4  Data Analysis
    5.5  Part 1 Conclusions
6. Part 2
    6.1  Introduction
    6.2  Hypothesis tests
    6.3  Data Analysis
    6.4  Part 2 Conclusions
7. Surveys
8. Part 3
    8.1  Introduction
    8.2  Hypothesis testing
    8.3  Data Analysis
    8.4  Surveys
    8.5  Part 3 Conclusion
9. Conclusions
10. Further Study
References
Appendices

1.      Introduction

       Since interest in online education has increased steadily in recent years, we wanted to evaluate the methodology of remote delivery of a high school-level mathematics curriculum. In addition, the COVID-19 pandemic moved all instruction in California online and gave us an unprecedented opportunity to analyze various methods of assessing student learning and participation in remote classrooms. In this study, we consider the following research questions: Is it possible to effectively assess student progress and skills in mathematics using various distance learning tools? Do our assessment tools need to change for better student success in the distance learning model?

       In California, in March of 2020, the school year for students at every education level ground to a halt. Students and teachers were forced out of in-person, in-classroom instruction into online learning environments. We study the effects of this switch using the data collected during the first academic quarter of distance learning in Part 1. As the fall term continued and there seemed to be no sign of the pandemic slowing, most schools in California remained in the distance learning model for the fall and winter terms of the 2020-2021 school year. Prior to this time in the history of education, the few students enrolled in online schools had done so at their own choosing. This unprecedented situation provided us with an opportunity to further investigate data on assessments in mathematics during online learning, beginning in the winter quarter (see Part 2). Additionally, we collected the data presented in Part 3 in the next quarter, by which time both instructors and students were experienced with online learning and familiar with the software environments and the various assessment tools used.

       In this study, we investigate different forms of assessment over three academic quarters of distance learning. We compare student achievement and participation levels on these assessment tools and the accompanying online learning activities (see Parts 1, 2, and 3 for each academic quarter). We acknowledge that many factors can determine the participation of students in online classes; however, in our study, we limit our scope to completion rates for activities and scores for specific types of implemented assessments.

                                        1.1 Background

       Many factors can influence a learner's performance in any learning environment, including motivation, emotions, and cognitive processes (Kim et al., 2014). In their study, Kim et al. found that students' emotions were an indicator of their success in an online mathematics course. Their research indicates that the presence of peers and interaction with a teacher can reduce the emotional toll that a strictly asynchronous course can cause.

       Motivation, in particular, is a critical factor in student success and must be considered when creating content and assessments for any course of study (Ames, 1992). During distance learning, motivation is a hurdle with which both students and teachers must contend. There are many distractions at home that can deter students from staying engaged during live synchronous class meetings and activities. The difficulty of using the digital resources and of accessing the materials can also be an issue. Student engagement is a leading factor in a learner's success and motivation to work (Barana et al., 2019). When students engage during a lesson or activity, they are more likely to achieve good results and enjoy their time in class. Therefore, the technologies that educators use during distance learning should implement gamification, simulations, and interaction to promote student engagement.

       Engagement should not stop at the learning activities but should also be a factor when creating the assessments. Formative assessment strategies can help students engage with the teacher and other students in the class. When creating a digital assessment, one should consider that certain multiple-choice and fill-in-answer questions do not provide evidence that learners used an appropriate method or line of thinking to solve problems (Sangwin & Kocher, 2016). Despite this research, we still administered these forms of assessment in our study, but we also used other assessment types in which students can demonstrate their thinking and understanding of the concepts.

       Over the past decade, there has been a significant shift in mathematics education in the United States (Schoenfeld, 2015). The adoption of the California Common Core State Standards for Mathematics (CCSSM) in 2010 shifted the focus from procedural exercises to cultivating students' ability to think in more complex mathematical ways. Therefore, it is natural that the assessment of this new way of teaching and learning should also change. Most assessments from the not-so-distant past required students to follow a set of rules to solve a problem, a task that can now be attempted with online tools and phone apps. Hence, teachers must be more creative with their assessments to ensure that they gain an accurate understanding of a student's mathematical abilities and thinking processes.

       There are two major forms of assessment: summative and formative. Formative assessments are less formal and provide teachers with information on what their students know about any given topic related to the current lesson content. A formative assessment can be anything from a one-question quiz to a verbal response during a classroom activity. The formative assessment provides teachers with a snapshot of where their students are in their progress toward mastery. Effective instructors use these formative assessments to drive the learning experience. Summative assessments are higher stakes, as they cover more topics, are typically given at the end of a unit, and follow a rubric or benchmark. Large-scale tests such as the SAT, ACT, and state tests are examples of cumulative summative assessments given at the end of high school.

       What is assessed, and how it is assessed, becomes the framework for instruction. Research shows that digital tools in the mathematics classroom have enhanced the learning experience for all students (Hillmayr et al., 2020). These tools can also be applied to assessments, both formative and summative.

                                  1.2 Research Questions

       Our study focuses on various methodologies for the assessment of student progress in the online learning environment. We especially look at the following questions:

Can student skills mastery be assessed during distance learning using various digital forms of assessment?

Does the type of assessment methodology affect student performance and participation during distance learning?

Which activities are best adopted as assessment tools for evaluating student progress in mathematics in an online learning environment?

       These questions were studied in Parts 1, 2, and 3 by analyzing the classroom data obtained during three academic quarters of the COVID pandemic from learning activities, assessments, and student surveys.

                                       1.3 Participants

       Our participants were students in grades 9-12 at a California high school, enrolled in the mathematics courses Math 1 (covering introductory algebra and geometry topics), Math 2 (algebra, geometry, and statistics topics), and Math 3 (algebra and trigonometry topics). All students were placed in the proper course based on their previous grades and teacher recommendations. The high school enrolls approximately 2,000 students. The student body is predominantly Hispanic (92.1%), with the remaining population being African American, White, Pacific Islander, Filipino, and Asian. In addition, 60% of the student population is classified as Fluent English Proficient, 16.5% as English Learner, 20% as English Only, and 1.3% as Initial Fluent English Proficient (IFEP); IFEP students demonstrate advanced proficiency in English on their initial CELDT (California English Language Development Test) testing (CDE data, 2019-2020 school year). Furthermore, 80.1% of the student body qualifies for free or reduced-price lunch, as of the latest statistics available from the 2018-2019 school year (NCES, 2019). All students have been assigned a laptop for school use. All grade 9 and 10 students have touchscreen Chromebooks, and the majority of juniors and seniors have an older version of Chromebook without a touchscreen.

                                          2.     Methodology

       Our data collection for this research was conducted during three academic quarters of the 2020-2021 school year at a California high school, and we refer to each step as Part 1, 2, or 3. All courses were taught using an integrated curriculum based on the textbooks Core Connections Integrated Math 1, 2, and 3, meaning that algebra, geometry, and statistics skills are all taught within each level: specifically, Math 1 (introductory algebra and geometry topics), Math 2 (algebra, geometry, and statistics topics), and Math 3 (algebra and trigonometry topics). All participants agreed to the use of their assignments and scores in our research project.

Part 1 of the study covered the first academic quarter of the year, which was 9 weeks long and during which the unexpected beginning of distance learning for all students started. We collected data from two groups of students enrolled in Math 2 and Math 3. The assessment data were obtained from 14 Math 3 students and 14 Math 2 students.

Part 2 of our study corresponded to the second academic quarter of the year and was also 9 weeks long. We collected data from 17 Math 3 and 13 Math 2 students. Note that thirteen participants from Part 1 were also participants in the Math 3 group for Part 2, with only 4 new participants added. In the Math 2 group, we had 10 participants from Part 1, with 3 new participants for Part 2.

A survey related to online learning experiences was given at the end of Part 2 to all Math 2 and Math 3 students at this high school who also completed the same assessments as our study groups. We had 173 respondents from Math 3 and 54 from Math 2 courses. See Appendix F.

A survey was also given to the teachers of all three groups to gauge their experiences with the teaching methodologies and assessments during distance learning. See Appendix G.

In Part 3, we collected data from a study group that included students from two Math 1 courses, one meeting at the beginning of the day (8:30 am) and the other at the end of the school day (2:00 pm). We refer to the morning group as the AM group and the afternoon group as the PM group. Part 3 corresponded with the third academic quarter of the school year, which began in January and concluded in April; this quarter was 10 weeks in length. We collected data from 33 participants in the AM study group and 29 students in the PM study group. Note that 58 ninth graders, 3 tenth graders, and 1 eleventh grader participated in Part 3.

A survey was given to the Math 1 students in our study group and to the larger group of all Math 1 students at the high school where our study was conducted. See Appendix I.

                                            3.        Objective

       The study was conducted during regular mathematics class sessions taught online by high school teachers. Students met with the instructor on the Google Meet video system for 5 hours a week for lectures and activities. They were then assigned various online assessments to be completed by the end of the week. The objective of this study was to investigate different forms of assessments given in the integrated Math 1, 2, and 3 courses during distance learning. The assessment types included a student-made video, an instructor video with embedded questions, a free-response assessment using online tools for submission, multiple-choice assessments, and interactive Google Slides with the Peardeck extension.

Here are descriptions of all the tools used in the study:

Google Slides - An online app used to create lessons in a presentation format. Google Slides was the method used to deliver instruction to the students. See Figure 1.
Link: https://www.google.com/slides/about/

Peardeck - An add-on for Google Slides that converts slides into an interactive tool to engage students during learning. The feature used for our assessment purposes was the draw tool, which makes any slide a whiteboard. See Figure 2.
Link: https://www.peardeck.com/googleslides

Figure 1. Google slide

Figure 2. Peardeck slide

Canvas Quiz - Canvas is a learning management system with a feature that allows instructors to create assessments in varying formats. In addition, the Canvas platform can be used to create question banks for any topic.
Link: https://www.canvas.net/

DeltaMath - An online site where students can practice skills and use instructional videos and examples to further understand mathematical concepts. See Figure 3.
Link: https://www.deltamath.com/

Figure 3. DeltaMath problem example

Flipgrid - An online site where students can record responses to a prompt and engage with each other and their instructor. The site allows students to record their screens while simultaneously recording themselves speaking. The videos are stored on the site and either shared with a group or kept private to the student and instructor.
Link: https://info.flipgrid.com/

Kami - A Chrome extension that allows annotation on any document in PDF format and saves the annotations. Kami provides many features; for our purposes, we utilized the draw tools and the equation editor. See Figure 4.
Link: https://www.kamiapp.com/

Figure 4. Kami worksheet and extension

Jamboard - A Google product that creates a whiteboard space for drawing. As with many of the Google tools, a Jamboard can be shared for collaboration activities. See Figure 5.
Link: https://jamboard.google.com/

Figure 5. Jamboard

Edpuzzle - An online site where videos can be used as learning tools for instruction and engagement by adding embedded questions in a free-response or multiple-choice format. See Figure 6.
Link: https://edpuzzle.com/

       Students participated in lectures and activities with their instructors. They were assigned online pre-assessment activities and were provided with a grading rubric before each final assessment. The survey was administered after all assessments had been completed.

Figure 6. Edpuzzle video and questions

All Math 2 and 3 course teachers were surveyed and interviewed to gauge their experiences with the types of assessments used during distance learning covered by our study. See Appendices F-I.

                                    4.     Data Collection Method

        Since, from the start, participation in online instruction was an issue, we decided to use the data from the online pre-assessment activities for comparison with the scores on our online assessments. In each part of the study, out of several activities offered to students, only two were chosen for each week's lessons leading up to the online assessment. Each assessment covered a core topic from the Math 1, 2, and 3 course curriculum and was based on the online lessons, activities, and pre-assessments. All of the activities were graded on a 5-point scale. Note that only students with no submission received a score of 0 on the activities and assessments. We used Google Sheets as the statistical software to visualize and analyze the data collected from participants.
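
While the study's analysis was done in Google Sheets, the same bookkeeping translates directly to other tools. As a hedged illustration only (the file and column names below are hypothetical, not the study's records), an equivalent offline workflow could export a gradebook as CSV and summarize it in Python:

```python
# Hypothetical sketch only: the study used Google Sheets, not Python.
# Assumes a gradebook exported as CSV with one row per participant and one
# column per activity or assessment, scored on the 0-5 scale (0 = no submission).
import pandas as pd

grades = pd.read_csv("math3_part1.csv")  # hypothetical file name

print(grades.describe())     # mean, standard deviation, quartiles per column
print((grades > 0).mean())   # participation rate: share with any submission
```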

        Data collected from the Peardeck slides where students had to show their work were graded by teachers on a 5-point scale. The surveys evaluating participants' experiences with online learning were created using a Google Form, and the instructors and students completed their surveys at the end of Part 2, during finals week.

                                                 5.        Part 1

                                           5.1         Introduction

        There were many obstacles in Part 1 when the sudden mandatory switch to online instruction happened, as both the instructors and the students were unfamiliar with the new software used by the school for distance learning. Canvas and Google Meet were the primary platforms for delivering content, assignments, and communication. Before the 2020-2021 school year, teachers at the high school used Google Classroom as their learning management system (LMS). However, for various reasons, the Mathematics Department as a team opted to use Canvas for all classes. The Canvas LMS was unfamiliar and hence challenging to navigate compared to Google Classroom, especially for untrained users. Another issue that both the instructors and the students struggled with was Internet connectivity and broadband problems. The school gave all students a Chromebook and, if they opted, students could also check out an internet hotspot for use in their homes. However, the hotspots originally issued to students were limited in the amount of data they provided and had to be replaced within the first four weeks of school with better ones.

        At the beginning of the academic quarter, teachers at all three levels used various tools to deliver instruction in an inquiry-based format. Desmos activities, Google Slides, Jamboards, and DeltaMath were some of the programs implemented in those beginning weeks. For equity purposes, the Mathematics Department always used a common curriculum and common assessments for all classes. In keeping with this policy, the content leads in Math 1, 2, and 3 designed Google Slides with the Peardeck add-on extension to deliver the instructional material. The Peardeck extension turns a Google Slides presentation into an interactive space where the instructor asks open-ended and multiple-choice questions and turns some slides into drawing slides. Instructors can see student work in real time using the instructor dashboard, and student responses can be seen in the Google Meet using the presentation screen.

        In Part 1 of our research, we used several different methods to assess student learning. Since the Mathematics Department at the high school decided to use common assessments, we were permitted to create the assessments used in both the Math 2 and 3 courses for all sections. We designed four methods of assessment for Part 1 of the study: a video assessment using Flipgrid, a free-response assessment using Kami and Jamboard, a free-response assessment using Peardeck, and a multiple-choice assessment using Canvas quizzes. There was not a particular synchronous time at which an assessment had to be completed. Instead, the assessments could be completed by students before the set deadline, whenever they decided they were ready. It may be that this lack of a timeline gave participants too much freedom and led to many of them not completing the first couple of assessments by their due date, particularly those in the Math 2 courses. Overall, we found that students in the Math 3 course completed activities and assessments at a higher rate than the students at the lower Math 2 level. To increase participation for both groups, we assigned a specific Google Meet time, which all students had to attend, to complete the assessments. All assessments in Part 1 were given at the end of the week on Friday during the regularly scheduled class period.

                               5.2      Description of Assessments

        Here we briefly discuss the four assessments that we designed for the study.

In the video assessment, students were asked to solve one problem using a virtual whiteboard called Jamboard. Before starting the assessment, students used a Google Doc to sign up for one equation or expression (see Appendix A). All students had their own individual equation (of comparable difficulty) so that no two answers would be the same. The rationale behind this choice was to ensure academic integrity by eliminating the possibility of copying an answer from another student. Students had to describe their methods in the video using mathematical language and reasoning, and they were given a 20-point rubric for grading (see Appendices B and C). The video assessment was administered to the study group (14 Math 3 and 14 Math 2 participants) during Part 1.
       For the traditional multiple-choice assessment, the bank of questions and the assessment itself were created in Canvas. The questions in the bank were structured to prevent the use of computer-solving tools (like Photomath) or searches on the Internet.

Figure 7. Test bank question for Math 3: Completing the square

For example, Math 3 students were asked to demonstrate their understanding of how to complete the square for a given quadratic equation by finding a mistake in the process, as shown in Figure 7. We set up a test bank in Canvas so that different questions would be generated for each student. The assessment consisted of 10 questions covering three topics from the week's lessons. The rubric for the multiple-choice assessments was a two-point scale on which students scored 1 point for any answer and 2 points for the correct answer. The assessment, once begun, had to be completed within 60 minutes.
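
The two-point rubric above is simple enough to state as code. The following sketch is only an illustration of the scoring rule (the helper name is hypothetical, not software used in the study):

```python
# Illustrative sketch of the two-point multiple-choice rubric described above:
# 0 points for a blank item, 1 point for any submitted answer, 2 for a correct one.
def mc_item_score(answered: bool, correct: bool) -> int:
    if not answered:
        return 0
    return 2 if correct else 1

# A 10-item attempt: 7 correct, 2 wrong, 1 left blank -> 7*2 + 2*1 + 0 = 16 of 20
items = [(True, True)] * 7 + [(True, False)] * 2 + [(False, False)]
print(sum(mc_item_score(answered, correct) for answered, correct in items))  # 16
```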

       The free-response assessment followed the methodology that the instructors used during a regular in-classroom school year. We employed a PDF file and the Kami extension for this method, allowing students to write on the PDF file and then submit it as a new file with the annotations included. The Kami extension has an equation-writing feature that benefited students who did not have a touchscreen. The scoring for this assessment was based on a 5-point rubric for each problem (see Appendix C). There was no time limit on this assessment, and students could work on it at their convenience.

       The interactive Google Slides assessment was similar to the free-response assessment above, except that it employed Google Slides and the Peardeck extension for completion. Students were asked to solve two problems on two separate slides in Peardeck; see Figure 8. We utilized the 5-point rubric for each problem on this assessment, as before. There was no time limit on these assessments, and students could work on them even after the class session had ended. (One drawback of Peardeck is that it is more tedious to write on a slide to show your work if you do not have a touchscreen.)

Figure 8. Peardeck slide with a student solution.

                                     5.3      Hypothesis testing

       Since the online assignments were different from the usual in-class activities, students considered them easier or harder depending on the electronic tool. We hypothesized that participants would perform better on the assessments that they perceived as easier to complete: not because the assessment content was considered easy, but because the actual assessment process (tool) seemed more manageable to them. To determine the preferred online assessment type, we administered a survey asking students which assessments were easiest to complete and which were most difficult, using a Likert scale [1 = very easy to 5 = very difficult] (Likert, 1932). Since we noticed that consistent participation in the assessments and activities during distance learning was also an issue, we also compared the participation rates between the assessments. We compared the results on the multiple-choice questions to all the other assessments, since the survey results showed this assessment type to be perceived as the most easily completed.

       Additionally, we tested our hypothesis that participation rates are higher on the assessments students found easiest to complete. The graphs in Figures 9 and 10 show the survey results for all participants in the Math 3 and Math 2 groups (not just the study groups). Note that the Edpuzzle assessment was not offered in Part 1, so we use the multiple-choice test in our hypothesis testing.

                             Figure 9. All students in Math 3 survey results

Note that three-quarters of Math 3 students found Edpuzzle (a video lesson with embedded questions) easiest to complete. We found a similar result for Math 2 students.

Figure 10. All students in Math 2 survey results

We phrased our research question as a hypothesis in statistical terms and tested it at a significance level of 0.05 using our collected data.

    1. Did students in both the Math 2 and Math 3 study groups have higher means on the assessment students felt was easiest to complete?

       Our hypothesis is that μ_O < μ_E, where μ_E is the mean score on the assessment students found easiest to complete and μ_O is the mean score on any one of the other assessments.

       H0: The mean of the easily completed assessment, as determined by the survey, is less than or equal to the mean of each other assessment.

       Ha: The mean of the easily completed assessment, as determined by the survey, is greater than the mean of each other assessment.

                                             H0: μ_O ≥ μ_E
                                             Ha: μ_O < μ_E
                                             α = 0.05

We compared student scores on the various assignments using Student's t-test, and we summarize the results in the following table for Math 2 and Math 3.

Math 3
  Test                                       t        p-value   Confidence interval
  Peardeck (O) vs. Multiple Choice (E)       3.18     0.998     [0.3775, 1.7625]
  Video (O) vs. Multiple Choice (E)          0        0.500     [-1.249, 1.2487]
  Mixed (O) vs. Multiple Choice (E)          -1.028   0.157     [-1.53, 0.5099]

Math 2
  Test                                       t        p-value   Confidence interval
  Mixed (O) vs. Multiple Choice (E)          -1.816   0.04      [-2.985, 0.1845]
  Video (O) vs. Multiple Choice (E)          -1.41    0.08      [-2.726, 0.5057]
  Free Response (O) vs. Multiple Choice (E)  -1.906   0.03      [-3.097, 0.1168]

Table 1. t-test results for scores vs. perceived difficulty of the assessment

Math 3 study group data analysis: Our level of significance for the t-test is α = 0.05, and the obtained p-values for the three assessments tested for this group were 0.998, 0.5, and 0.157. Therefore, we conclude there is insufficient evidence to support the claim that students scored better on the assessment they thought was most easily completed (the multiple-choice) than on the other assessments in Part 1 for the Math 3 group. Math 3 students thus performed similarly on all types of assessments.

Math 2 study group data analysis: The four assessments for this group were compared using Student's t-test for means. We again compared the multiple-choice assessment to all the other assessments. The level of significance for the hypothesis test was α = 0.05, and the p-values for the three assessments tested were 0.04, 0.08, and 0.03. Therefore, for this group, we conclude there is sufficient evidence to support the claim that students scored better on the most easily completed multiple-choice assessment than on the mixed free-response/multiple-choice assessment and the free-response Jamboard assessment. However, at α = 0.05, there was insufficient evidence to support the claim that students performed better on the preferred multiple-choice assessment than on the video assessment. Our testing does show that, at α = 0.10, Math 2 study group participants performed significantly better on the assignments they thought were easy to complete, and at that level the result applies to all categories of tasks, which is quite interesting and confirms our expectations.
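
For reference, this kind of one-sided two-sample comparison is straightforward to reproduce in standard statistical software. The following is a minimal sketch, assuming a recent SciPy, with made-up scores on the converted 5-point scale rather than the study's data:

```python
# Minimal sketch of the one-sided two-sample t-test used above.
# Scores are invented placeholders, not the study's data.
from scipy import stats

other_assessment = [4.2, 0.0, 2.9, 5.0, 2.9, 4.2, 2.9, 0.0, 4.2, 2.1, 5.0, 0.0, 2.9, 3.5]
multiple_choice  = [3.9, 4.4, 3.5, 5.0, 2.6, 3.0, 4.8, 3.5, 2.2, 3.9, 4.3, 1.3, 3.5, 3.6]

# Ha: mu_O < mu_E, i.e., the mean on the other assessment is below the mean on
# the multiple-choice (perceived-easiest) assessment.
t_stat, p_value = stats.ttest_ind(other_assessment, multiple_choice,
                                  alternative='less')  # pooled-variance Student's t-test
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")  # reject H0 at alpha = 0.05 when p < 0.05
```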

        For the second question, we compared the participation rates on the assessments for both groups, using a two-sample test for the difference between two proportions. From the student survey results, we determined which assessment students found easiest to complete, which was the multiple-choice assessment, and compared it to the other three assessments: free response, video, and mixed.

   2.   Our specific research question, formulated as a statistical hypothesis, is the following: Is participation higher on the multiple-choice assessment than on each of the other assessments?

        We hypothesize that p_O < p_E, where p_E is the proportion of students who completed the multiple-choice assessment and p_O is the proportion of students completing any one of the other assessments.

        H0: The proportion of students completing the multiple-choice assessment is less than or equal to the proportion completing each other assessment.

        Ha: The proportion of students completing the multiple-choice assessment is greater than the proportion completing each other assessment.

                                            H0: p_O ≥ p_E
                                            Ha: p_O < p_E
                                            α = 0.05

        The level of significance is α = 0.05. For the Math 3 study group, we see that, in general, there is not enough evidence to support our claim that participation is greater on the most easily completed assessment. However, for the video assessment, where the obtained p-value was 0.0333, the data do support our hypothesis that participation was significantly better on the multiple-choice questions than on the video assessment.

Math 3
  Test                                       t        p-value   Confidence interval
  Peardeck (O) vs. Multiple Choice (E)       -1.018   0.1542    [-0.2063, 0.0635]
  Video (O) vs. Multiple Choice (E)          -1.833   0.0333    [-0.4292, 0.00006]
  Mixed (O) vs. Multiple Choice (E)          -1.018   0.1542    [-0.2063, 0.0635]

Math 2
  Test                                       t        p-value   Confidence interval
  Mixed (O) vs. Multiple Choice (E)          -2.763   0.0029    [-0.6878, -0.1693]
  Video (O) vs. Multiple Choice (E)          -3.055   0.001     [-0.7619, -0.2381]
  Free Response (O) vs. Multiple Choice (E)  -2.763   0.0029    [-0.6878, -0.1693]

Table 2. Statistics for the participation hypothesis tests

For the Math 2 study group, however, we can conclude that there is sufficient evidence to support the claim that the proportion of students completing the multiple-choice assessment was greater than for all other assessments in our research. The p-values for the tests were 0.0029, 0.001, and 0.0029. Therefore, the participation rates were significantly higher for the Math 2 group on multiple-choice assignments than on any other assignments. This result may be related to the fact that Math 3 students are more mature and try to complete more of the tasks assigned to them.
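
The participation comparison can likewise be reproduced with a standard pooled two-proportion test. The sketch below is illustrative only; the counts are taken from the reported Part 1 participation rates (11 of 14 on the video assessment versus 14 of 14 on the multiple-choice for the Math 3 group), and with them the statistic comes out close to the video row of Table 2:

```python
# Minimal sketch of a pooled two-proportion test for participation rates.
import math

def two_proportion_test(x_other, n_other, x_easy, n_easy):
    """Left-tailed test of H0: p_O >= p_E against Ha: p_O < p_E."""
    p_o, p_e = x_other / n_other, x_easy / n_easy
    p_pool = (x_other + x_easy) / (n_other + n_easy)      # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_other + 1 / n_easy))
    z = (p_o - p_e) / se
    p_value = 0.5 * (1 + math.erf(z / math.sqrt(2)))      # P(Z <= z), left tail
    return z, p_value

# 11 of 14 completed the video assessment vs. 14 of 14 for the multiple-choice
z, p = two_proportion_test(11, 14, 14, 14)
print(f"z = {z:.3f}, p = {p:.4f}")  # approximately z = -1.833, p = 0.033
```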

                                         5.4       Data Analysis

                                          Comparing means

       Since some of the rubrics for the assessments used different point ranges, we converted all grades to a 5-point scale in order to compare the means. The Math 3 study group had the highest mean, 4.43, and the smallest standard deviation, 0.65, on the Peardeck assessment. Students scored lowest on the assessment that mixed free response and multiple choice: the mean of the mixed assessment was 2.84, and the standard deviation was 1.51. Although the video assessment did not have the lowest mean, it did have the highest standard deviation, 2.0. Note that three students from the Math 3 group did not complete the video assessment, which explains the high standard deviation for this assessment type compared to the others.
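
As a concrete illustration of this conversion (with invented raw scores, not the study's records), a 20-point video-rubric score rescales linearly onto the 5-point range before the means are compared:

```python
# Illustrative sketch of rescaling rubric scores to a common 5-point scale.
import statistics

def to_five_point(score, max_points):
    """Linearly rescale a rubric score onto the 0-5 range."""
    return score / max_points * 5

video_raw = [14, 20, 0, 17, 11, 20, 15, 0, 18, 9, 0, 16, 13, 12]  # invented 20-point scores
video_scaled = [to_five_point(s, 20) for s in video_raw]
print(round(statistics.mean(video_scaled), 2), round(statistics.stdev(video_scaled), 2))
```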

Math 3
  Assessment                 Mean    Standard deviation
  Video                      3.36    2.0
  Multiple choice            3.36    1.08
  Peardeck                   4.43    0.65
  Mixed                      2.85    1.51

Math 2
  Assessment                 Mean    Standard deviation
  Video                      2.61    2.4
  Multiple choice            3.72    1.7
  Mixed                      2.32    2.33
  Free-response Jamboard     2.23    2.38

Table 3. Means on Part 1 assessments for Math 2 and Math 3

For the Math 2 study group, we found that the students' highest mean was on the multiple-choice assessment, at 3.72 with a standard deviation of 1.7, which was also the smallest standard deviation among the four assessment types. Hence student scores were consistent and quite good. The multiple-choice assessment also had the highest level of participation for the Math 2 study group. Note that the video assessment for this group had the largest standard deviation, 2.4, as many of the students in this group did not participate in the assessment and scored 0.

                                  Pre-assessment Activities

We offered various independent weekly pre-assessment activities with feedback to all students, for additional learning and practice related to topics covered during class sessions. In Tables 4-7, we compare the scores on pre-assessment activities with the assessment tasks for both study groups, Math 3 and Math 2. For Math 3, the largest mean on the pre-assessment activities, 3.54, is for the video with embedded questions (Edpuzzle).

Math 3              Pre-activity   Pre-activity    Assessment   Pre-activity   Pre-activity   Assessment
                    Desmos Slide   Free Response   Video        Canvas Quiz    Kami           Multiple Choice
Standard error      0.48           0.57            0.53         0.32           0.56           0.29
Mean                2.79           2.43            3.36         2.36           2.14           3.36
Mode                4              0               5            3              0              3.91
Median              3.5            2.5             4.375        3              2.5            3.48
Standard deviation  1.81           2.14            2            1.22           2.08           1.08
Sample variance     2.54           4.99            2.69         1.16           4.23           1.08
Skewness            -0.54          0               -1.03        -1.11          0.11           -0.21
Min                 0              0               0            0              0              1.3
Max                 5              5               5            4              5              5
Range               5              5               5            4              5              3.7
Count               14             14              14           14             14             14
No. zeros           3              5               3            2              6              0
Participation       79%            64%             79%          86%            57%            100%

Table 4. Math 3 statistics for 14 participants on pre-activities

The students had a higher mean on this activity than on the assessment in the same week, which was 2.84. However, the data for the Edpuzzle are highly skewed due to the high participation of students who scored low on the activity. The lowest means are on the Kami activities, at 2.14, 1.86, and 1.57. Participation in the assessments was high, with 100% participation in the multiple-choice and Peardeck assessments, 93% in the mixed assessment, and 79% in the video assessment. The lowest participation rates among the pre-assessment activities were on the free-response Kami assignments, with 64%, 57%, and 64% of students turning them in. The highest participation rate was for the Edpuzzle activity, with 93% of students completing it.

For the Math 2 group, the DeltaMath activities had the highest means, at 3.21 and 2.36. The mean for the DeltaMath activity was higher than the mean for the assessment of the same week. The Desmos activity for Math 2 that was most creative had the lowest mean, at 1.07.

Math 3              Pre-activity   Pre-activity   Assessment   Pre-activity   Pre-activity   Assessment
                    Kami           DeltaMath      Peardeck     Edpuzzle       Kami           Mix
Standard error      0.44           0.52           0.17         0.29           0.42           0.4
Mean                1.86           3.07           4.43         3.54           1.57           2.84
Mode                0              5              5            3.75           0              4.17
Median              2              3              4.75         3.75           2              2.92
Standard deviation  1.66           1.94           0.65         1.09           1.55           1.51
Sample variance     2.49           2.72           0.41         1.59           2.44           2.01
Skewness            0.26           -0.63          -0.44        -2.94          0.27           -0.43
Min                 0              0              3.5          0              0              0
Max                 5              5              5            4.4            4              5
Range               5              5              1.5          4.4            4              5
Count               14             14             14           14             14             14
No. zeros           5              3              0            1              6              1
Participation       64%            79%            100%         93%            57%            93%

Table 5. Math 3 statistics for 14 participants on pre-activities, continued

As with the assessments, we see that the standard deviations on all activities are high, ranging from 1.7 to 2.5. The participation rate for the Math 2 study group is much lower on all activities and assessments compared to the Math 3 group. The lowest participation was on the mixed free-response and multiple-choice assessment, with only 50% of students completing it. The highest participation rate was on the multiple-choice assessment, which 86% of students completed. Participation in the activities was very low for this group as well. The lowest participation was on the free-response Kami assignments, with 21% and 29% of students completing them. The highest participation rate among the non-multiple-choice activities was on the DeltaMath activity, at 64%, which required students to input answers in an equation box.

Math 2              Pre-activity   Pre-activity   Assessment   Pre-activity   Pre-activity   Assessment
                    Canvas Quiz    DeltaMath      Video        Kami           Kami           Multiple Choice
Standard error      0.6            0.66           0.64         0.67           0.57           0.45
Mean                2.32           3.21           2.61         2.32           1.25           3.72
Mode                0              5              0            0              0              4.77
Median              2.5            5              3.5          1.25           0              4.435
Standard deviation  2.24           2.49           2.4          2.49           2.14           1.7
Sample variance     5.01           6.18           5.74         6.22           4.57           2.9
Skewness            0.06           -0.67          -0.19        0.16           1.29           -1.7
Min                 0              0              0            0              0              0
Max                 5              5              5            5              5              5
Range               5              5              5            5              5              5
Count               14             14             14           14             14             14
No. zeros           6              5              6            7              10             2
Participation       57%            64%            57%          50%            29%            86%

Table 6. Math 2 statistics for 14 participants on pre-activities

The tables show that overall participation rates in Part 1 of the study were low, and scores on the pre-activities were also low. This posed a new issue: modifying the teaching to motivate students to engage more in their online learning and to obtain better assessment results in Part 2 as our data collection continued.
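
For readers who want to recompute tables like these, the per-column statistics (mean, mode, median, standard deviation, variance, skewness, range, zero count, and participation rate) can be produced in one pass. This is a minimal sketch with placeholder scores, assuming pandas and SciPy:

```python
# Illustrative sketch of the per-activity summary statistics in Tables 4-7.
# Scores are placeholders on the 5-point scale (0 = no submission).
import pandas as pd
from scipy.stats import skew

scores = pd.Series([0, 2.5, 4, 3.5, 5, 0, 3, 4, 2.5, 0, 5, 4, 3.5, 2])

summary = pd.Series({
    "mean": scores.mean(),
    "mode": scores.mode().iloc[0],        # smallest mode if there are ties
    "median": scores.median(),
    "standard deviation": scores.std(),   # sample SD (ddof = 1)
    "sample variance": scores.var(),
    "skewness": skew(scores),
    "min": scores.min(),
    "max": scores.max(),
    "range": scores.max() - scores.min(),
    "count": scores.count(),
    "no. zeros": int((scores == 0).sum()),
    "participation": (scores > 0).mean(), # share with any submission
})
print(summary.round(2))
```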

                                                    Correlation

       We looked carefully at the variables that may have influenced participants' performance during Part 1, based on the survey data, assessment scores, and class records. We display our findings in Table 8.

       Our data analysis shows a very high positive correlation between the grade at the end of the academic quarter and the Peardeck scores for the Math 2 study group. Hence this assessment may be a good predictor of students' overall performance.

Math 2              Pre-activity   Pre-activity   Assessment   Pre-activity   Pre-activity   Assessment
                    DeltaMath      Desmos         Mix          Desmos         Kami           Free Response Jamboard
Standard error      0.67           0.57           0.62         0.5            0.63           0.64
Mean                2.36           1.07           2.32         1.21           1.93           2.23
Mode                0              0              0            0              0              0
Median              1.5            0              2.125        0              0              1.5
Standard deviation  2.5            2.13           2.33         1.88           2.36           2.38
Sample variance     6.25           4.53           5.44         3.53           5.57           5.65
Skewness            0.11           1.57           0.15         1.13           0.45           0.14
Min                 0              0              0            0              0              0
Max                 5              5              5            5              5              5
Range               5              5              5            5              5              5
Count               14             14             14           14             14             14
No. zeros           9              8              7            7              11             6
Participation       36%            43%            50%          50%            21%            57%

Table 7. Math 2 statistics for 14 participants on pre-activities, continued

Additionally, we found a high positive correlation between the multiple-choice assessments and the final grade at the end of the academic quarter for both the Math 2 and 3 study groups. This is interesting, as students found the multiple-choice tasks easiest to complete, and the vast majority participated in them. Overall GPA also had a high positive correlation with the grade for the quarter, which may be related to student motivation and maturity. Note that for the Math 2 study group, except for the multiple-choice assessment, there was a high correlation between participants' results on all other assessments and their overall GPA (i.e., more students successfully completed multiple-choice quizzes than GPA would predict). We found no correlation between gender and performance on the assessments; that is, all students were affected by the new online learning environment similarly, regardless of gender.

                                          Grade at end              Peardeck              CAASPP Scores
                                           of quarter      GPA       grades      Gender      grade 8
                                                         Math 3
 Video Assessment                             0.79        0.51        0.68        0.05         0.29
 Multiple Choice Assessment                   0.84        0.67        0.69        0.1          0.18
 Peardeck Assessment                          0.31        0.16        0.26       -0.2         -0.08
 Mixed Assessment                             0.75        0.64        0.63       -0.15         0.16
 Grade at the quarter                          1          0.84        0.72        0.09         0.18
                                                         Math 2
 Video Assessment                             0.78        0.81        0.67       -0.45         0.51
 Multiple Choice Assessment                   0.71        0.57        0.7        -0.21         0.38
 Free Response Jamboard                       0.84        0.92        0.8        -0.33         0.7
 Multiple Choice/Free Response Mixed          0.8         0.92        0.88       -0.39         0.64
 Grade at the quarter                          1          0.92        0.96       -0.28         0.6
              Table 8. Correlation between assessment scores and overall student performance

                                      5.5          Part 1 Conclusions

       Part 1 of our research was the first attempt to teach and assess all students' learning

online. For the Math 2 study group, our results show that the learner's perceived ease of the

assessment leads to higher participation rates but not necessarily to higher scores and better

understanding. For both the Math 2 and 3 study groups, the multiple-choice format was perceived as the easiest assessment. To understand why, we analyzed the survey responses containing student explanations. Several students noted that if their answer matched one of the multiple-choice options, they felt they had done the problem correctly. Other typical responses were similar to the one below:

        Because it gives me an idea what the answer can look like. The other choices made it

       easier to check my work.

We noticed that the participation rate in the free-response pre-assessment activities using the Kami extension was low for both study groups. For Kami, students had to show their work, much as they would on paper assessments during regular in-class sessions. However, this disadvantaged students without a stylus and a touchscreen, who may have found the task more tedious and opted not to complete it for this rather technical reason.

We evaluated our data from Part 1 carefully before designing activities and assessments for the

next steps in our study.

                                            6.      Part 2

                                      6.1        Introduction

        Part 2 of this study was conducted in Winter 2020 when students and instructors were

somewhat experienced with the online environment and activities. Using the experience from

Part 1, we created better and more engaging online lessons and assessments, adjusted for ease of student use. We investigated three different assessments in this part of the research project: an

instructor video embedded with questions, a free-response assessment, and a DeltaMath

assessment. The DeltaMath assessments replaced Canvas quizzes to eliminate the self-standing

multiple-choice assessments from Part 2 of the study.

       Our observation in Part 1 was that students were not submitting the video-based

assessments that required making a video and explaining their thinking and process. Since these

video assessments had the highest percentage of students not completing them, we removed them

altogether from Part 2. We replaced them with an instructor video embedded with frequent

multiple-choice and free-response questions. The instructor video assessment was created using

the online platform Edpuzzle. There was a mix of free-response and multiple-choice questions

embedded in the various parts of the video lessons. This form of assessment provided students

with the opportunity to review, practice, and apply the new concepts learned in the course while

also assessing their understanding. The Edpuzzle platform scores each question as a percentage of correctness, so we used the standard 5-point rubric for the free-response questions, converting one point to 20% correctness. We scored the multiple-choice questions on the same scale: 5 points for each correct answer and 2.5 points for any incorrect answer.
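This scoring scheme can be summarized in a short sketch; the helper functions below are our own hypothetical names, not part of the Edpuzzle platform.

    def rubric_to_percent(points: float) -> float:
        """Convert a 0-5 rubric score to Edpuzzle's percent scale (1 point = 20%)."""
        return points * 20.0

    def multiple_choice_points(correct: bool) -> float:
        """5 points for a correct answer, 2.5 points for any incorrect answer."""
        return 5.0 if correct else 2.5

    assert rubric_to_percent(3) == 60.0          # 3 of 5 rubric points -> 60%
    assert multiple_choice_points(False) == 2.5  # incorrect answers still earn half credit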

       In Part 2, the instructors were given access to the full version of the DeltaMath software, which allowed them to create assessments. By this point in the school year, students had been using the DeltaMath platform for asynchronous practice assignments. We felt this would be an alternative way to assess student learning, replacing the multiple-choice assessments from Part 1. The problems selected for the evaluation were based on the week's class assignments in the DeltaMath platform: one question per new concept or skill. We again used the 5-point scale on each question. The site does not recognize different forms of an answer and instead requires a response in a precise format; any solution outside those parameters results in zero points. For example, one such question asked students to find the y-intercept of a polynomial function and accepted only the y-value as the correct solution. Therefore, if a student obtained the answer 12 but entered the response as y=12 or (0,12), they scored zero points. To address these issues, students could submit their work to the Canvas site for partial credit on answers the system marked wrong. This assessment had a 60-minute time limit once started, and in Part 2, to ensure student participation, assessments were completed during the supervised synchronous sessions on Fridays.
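To illustrate the exact-match issue, here is a hypothetical Python sketch (not DeltaMath's actual grading code) contrasting strict matching with a checker that normalizes equivalent forms of a y-intercept answer before comparing.

    def normalize(ans: str) -> str:
        """Reduce equivalent forms such as 'y=12' and '(0,12)' to the bare value '12'."""
        a = ans.replace(" ", "").lower()
        if a.startswith("y="):
            a = a[2:]
        if a.startswith("(0,") and a.endswith(")"):
            a = a[3:-1]
        return a

    key = "12"  # the answer key stores only the y-value
    print("y=12" == key, "(0,12)" == key)                        # False False -> zero points
    print(normalize("y=12") == key, normalize("(0,12)") == key)  # True True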

       The final assessment in Part 2 used the free-response questions in Kami, i.e., students

could write the answers as they preferred. There was no change in the parameters from Part 1 to

Part 2 on this assessment. Again, we used the 5-point scale, and students did not have a time

limit and could finish at their convenience.

       At the end of Part 2, we gave a voluntary student survey to all the Math 2 and 3 students. Fifty-five Math 2 students and 174 Math 3 students responded to the survey. Note that all the instructors of Math 2 and 3 from academic quarters 1 and 2 (whether or not they taught study groups in Part 1 or 2) were also given a survey to provide feedback on their experience with the assessments we designed and implemented.

                                    6.2        Hypothesis tests

We compared three assessments for both the Math 2 and Math 3 groups. From the survey at the end of the academic quarter, we found that both Math 2 and Math 3 students considered the Edpuzzle assessment the easiest of all the assessments to complete.

We considered the following research question and tested it using our data.

       Do students in both the Math 2 and Math 3 study groups have higher means on the

       assessments perceived as easier to complete?

       Our hypothesis is that μO < μE, where μE is the mean on the Edpuzzle assessment, which was perceived as easy to complete, and μO is the mean on any of the other assessments.

       H0: The mean of the easily completed assessment, as determined by the survey, is less than or equal to the mean of each of the other assessments.

       Ha: The mean of the easily completed assessment, as determined by the survey, is greater than the mean of each of the other assessments.

                                               H0: μO ≥ μE
                                               Ha: μO < μE
                                                α = 0.05

The level of significance for the hypothesis tests is α = 0.05, and the p-values when comparing both groups were 0.977, 0.906, 0.510, and 0.849. Thus, all p-values were greater than the level of significance; therefore, we conclude there is insufficient evidence to support the claim that students scored better on the assessment that was perceived as most easily completed, the Edpuzzle assessment.

                   Math 3 Tests                    t       p-value     Confidence interval
          DeltaMath (O) vs. Edpuzzle (E)         2.08       0.977       [0.01679, 1.5232]
        Free Response (O) vs. Edpuzzle (E)       1.347      0.906       [-0.2917, 1.4317]
                   Math 2 Tests                    t       p-value     Confidence interval
          DeltaMath (O) vs. Edpuzzle (E)         0.025      0.510       [-1.627, 1.6667]
        Free Response (O) vs. Edpuzzle (E)       1.055      0.849       [-0.6793, 2.0993]
             Table 9. Testing the hypothesis on performance on easy vs. harder tasks

Therefore, we conclude that the type of online assessment played no significant role in student

performance. This is an interesting finding, as it suggests that teachers can choose the assessment types that fit their lessons and technical experience without affecting student performance.
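For readers who wish to reproduce this kind of test, the following is a minimal sketch of a one-sided two-sample t-test, assuming Welch's unequal-variance version in Python with SciPy; the score arrays are hypothetical placeholders, not the study's data.

    from scipy import stats

    # Hypothetical per-student scores on the 5-point scale
    other    = [5, 4, 5, 3, 5, 4, 5, 2, 5, 4]   # e.g., DeltaMath (O)
    edpuzzle = [3, 3, 4, 2, 3, 4, 3, 3, 4, 3]   # Edpuzzle (E)

    # Ha: muO < muE is a left-tailed test, hence alternative='less'; a p-value
    # near 1 (like the 0.977 in Table 9) fails to reject H0.
    t, p = stats.ttest_ind(other, edpuzzle, equal_var=False, alternative="less")
    print(f"t = {t:.3f}, p-value = {p:.3f}")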

                                       6.3       Data Analysis

                                        Comparing means

       Note that we converted all assessment scores to a 5-point scale for equitable comparison. For the Math 3 study group, the overall assessment means were relatively high, with the highest mean for DeltaMath at 3.91 (standard deviation 1.31) and the lowest for Edpuzzle at 3.14 (standard deviation 0.78). The free-response assessment had the highest number of non-participating students, 2 out of the 17, with its mean in the middle at 3.71 (standard deviation 1.56). We conclude that at this level, all types of assignments supported student learning and performance. For the

Math 2 study group, the overall means for our three types of assessments were not as high as for the Math 3 study group. The highest mean was for the free-response type at 3.52, which also had the smallest standard deviation of the three assessments at 1.65. In this respect, the statistics are consistent with the Math 3 results, where the free-response mean was also relatively high. However, the mean scores for the

               Part 2            Assessment             Mean      Standard Deviation
               Math 3            Edpuzzle               3.14             0.78
                                 DeltaMath              3.91             1.31
                                 Free Response          3.71             1.56
               Math 2            Edpuzzle               2.81             1.78
                                 DeltaMath              2.83             2.26
                                 Free Response          3.52             1.65
          Table 10. Means on various types of assessments for Math 2 and Math 3 study groups

Edpuzzle assessment at 2.81 and DeltaMath at 2.83 were relatively low, which means that

participants did not perform as well as we hoped. Therefore, there is a need to modify these

assessments for students at this level.

                             Comparing pre-assessment activities

In Tables 11-14 below, we compare the data for the pre-assessment activities with the related assessments from the same week.
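The summary rows in these tables (standard error, skewness, participation rate, and so on) can be reproduced with standard routines; below is a minimal Python sketch with NumPy and SciPy, using hypothetical scores in which, as is typical in our tables, zeros correspond to non-participants.

    import numpy as np
    from scipy import stats

    # Hypothetical 0-5 scores for one activity; 0 = did not participate
    scores = np.array([5, 0, 4, 3.5, 0, 5, 2, 0, 5, 4, 1.5, 3, 0, 5, 2.5, 4, 0])

    n = scores.size
    print("Mean:            ", round(scores.mean(), 2))
    print("Standard Error:  ", round(scores.std(ddof=1) / np.sqrt(n), 2))
    print("Median:          ", np.median(scores))
    print("Sample Variance: ", round(scores.var(ddof=1), 2))
    print("Skewness:        ", round(float(stats.skew(scores)), 2))
    print("No. Zeros:       ", int((scores == 0).sum()))
    print("Participation %: ", round(1 - (scores == 0).mean(), 2))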

       For the Math 3 study group, the largest mean obtained was on the DeltaMath activity at 4.12; however, the data for this activity are highly skewed, with a large standard deviation of 1.96. The skewness arises because every participant scored either 5 or 0 on this activity. The smallest mean was on a Canvas quiz, at 1.62 with a standard deviation of 1.7. For the Math 2 group, the highest mean obtained on the pre-assessment activities was on the Kami activity, at 3.15 with a standard deviation of 2.08. This surprised us,

considering that in Part 1, the Math 2 group had the lowest means and participation in the Kami

activities.

     Math 3              Pre-activity   Pre-activity   Assessment   Pre-activity   Pre-activity   Assessment
                             Kami        DeltaMath      Edpuzzle      Edpuzzle      DeltaMath      DeltaMath
 Standard Error              0.57           0.54          0.19          0.53           0.48           0.32
 Mean                        2.65           3.26          3.14          3.82           4.12           3.91
 Mode                         0              5            2.94           5              5              5
 Median                      3.5             5            2.99           5              5             4.15
 Standard Deviation          2.34           2.24          0.78          2.19           1.96           1.31
 Sample Variance             5.46           5.03          0.6           4.78           3.86           1.71
 Skewness                   -0.25          -0.8           0.73         -1.37          -1.87          -1.86
 Min                          0              0            1.84           0              0              0
 Max                          5              5              5            5              5              5
 Range                        5              5            3.16           5              5              5
 Count                        17             17             17           17             17             17
 No. Zeros                    6              5              0            3              3              1
 Participation %             0.65           0.71            1           0.82           0.82           0.94
                          Table 11. Math 3 statistics on pre-activities

            Math 3                 Pre-activity        Pre-activity         Assessment
                                      Desmos            Canvas Quiz        Free Response
 Standard Error                        0.48                0.41                0.38
 Mean                                  3.97                1.62                3.71
 Mode                                   5                   0                   5
 Median                                 5                  1.25                 4
 Standard Deviation                    1.99                1.7                 1.56
 Sample Variance                       3.95                2.88                2.44
 Skewness                             -1.59                0.24               -1.7
 Min                                    0                   0                   0
 Max                                    5                  3.75                 5
 Range                                  5                  3.75                 5
 Count                                  17                  17                  17
 No. Zeros                              3                   8                   1
 Participation %                       0.82                0.53                0.94
                   Table 12. Math 3 statistics on pre-activities (continued)

However, participation for this group increased from Part 1 across all assignments and assessments. The increase in participation rate could be explained by students being more experienced with online learning and by their realization that the grades they were earning would go on their transcripts. During the instructor interviews, two teachers stated that students had initially perceived this year as having the same grade allowances and benefits as the previous semester, in the 2019-2020 school year. In that semester, because of the unexpected switch to online learning, teachers in the math department supported students' progress by giving all of them credit for the semester regardless of their effort or grades during distance learning. Hence some of the students in our Part 2 study may have expected similar treatment and did not apply themselves as much as they should have.

     Math 2              Pre-activity   Pre-activity   Assessment   Pre-activity   Pre-activity   Assessment
                             Kami        DeltaMath      Edpuzzle      Edpuzzle      DeltaMath      DeltaMath
 Standard Error              0.58           0.7           0.46          0.57           0.72           0.49
 Mean                        3.15           3.08          3.41          3.04           2.69           2.64
 Mode                         5              5            3.75           5              5              0
 Median                       4              5            3.75           4              5             3.04
 Standard Deviation          2.08           2.53          1.65          2.07           2.59           1.78
 Sample Variance             4.31           6.41          2.71          4.27           6.73           3.18
 Skewness                   -0.51          -0.54         -1.54         -0.37          -0.18          -0.54
 Min                          0              0              0            0              0              0
 Max                          5              5              5            5              5              5
 Range                        5              5              5            5              5              5
 Count                        13             13             13           13             13             13
 No. Zeros                    2              5              2            2              6              3
 Participation %             0.92           0.69          0.92          0.92           0.62           0.85
                   Table 13. Math 2 study group statistics on pre-activities

For example, the lowest participation rate we see for this group is on the Canvas quiz, which was perceived as easy to complete: only 53% of students completed the activity, with a mean of only 2 out of 5 and a standard deviation of 2.26 (see Table 14).
