THE DEVELOPMENT AND VALIDATION OF A TWO-TIERED INSTRUMENT TO IDENTIFY ALTERNATIVE CONCEPTIONS IN EARTH SCIENCE
Katherine Mangione Leslie
Jean E. Dockers
Michael J. Wavering
Abstract
Content knowledge is one of the primary focuses of teacher certification. In this continuation of the 2005 ACES-Q study, the original questionnaire was revised and the earth science content knowledge of 56 preservice elementary teachers in their senior year was assessed.
“I would research the concept [solar eclipse] then I would
incorporate a fun activity such as a night observation…”
-Preservice Teacher
Theoretical Background
The above quote
was one of several from our study that demonstrates the lack of content
knowledge prevalent in preservice teachers.
This particular quote mirrors the sentiments from a preservice teacher
from our original study:
“… to teach the solar eclipse I
would bring in many hands on materials.
I would also hold a night session to watch a solar eclipse.” (Leslie,
Dockers, & Wavering, 2005)
Content knowledge
is a primary focus for teacher certification programs. Research from the past two decades shows
strides being made in connecting content knowledge to instruction (Wilson,
Shulman, & Richert, 1987; Aubrey, 1996; Kallery & Psillos, 2001). This is in part due to Shulman's (1986, 1987)
theoretical development of pedagogical content knowledge (PCK).
Content knowledge
plays a critical part in successful teaching.
Hashweh (1987) found that in planning instruction, teachers tend to
delete details that they do not understand.
By doing so they may be unwittingly passing on their own alternative
conceptions to their students. With this
in mind, we should “reexamine our assumption that subject matter knowledge
required for teaching can be acquired solely through courses taken in the
appropriate university department” (Grossman, Wilson, & Shulman, 1989, p.
23). We must be aware of our preservice
teachers’ needs to develop scientific understandings about concepts they will
be responsible for teaching and possible misconceptions that they may bring to
the learning environment.
Earth science is
the one content area that is present at all levels of the Arkansas Science
Frameworks, and is therefore required to be taught at all levels. The information gathered in this study will
potentially benefit not only the preservice teachers in their understanding of
alternative conceptions, but will aid professors in both the education and
science departments in restructuring classes to address alternative conceptions
via classes designed for conceptual change.
A great deal of research on various alternative conceptions has been generated over the past several decades. However, there has not been a push to create a teacher-friendly instrument that specifically identifies earth and space science misconceptions. One goal of this study was to create a questionnaire that enables professors of preservice teachers of all grade levels to assess their students' alternative conceptions in earth and space science. Another aspect of this study was to identify alternative conceptions of earth science held by preservice teachers.
Many terms are used to describe the phenomenon of nonscientific conceptions in the learning environment. Some of these include misconceptions, naïve beliefs, persistent pitfalls, or science fragments (Wandersee, Mintzes, & Novak, 1994). For this study we use the term alternative conceptions, defined as an individual's ideas that are at odds with, or do not match, current scientific understandings. We chose this definition because it recognizes the learner as an individual trying to make sense of the world with understandings that they have constructed and that work for them (Leslie, Dockers, & Wavering, 2005).
Children's earth science concepts have been investigated by many researchers in the field of science education (Jones & Lynch, 1987; Baxter, 1989; Schoon, 1989, 1992; Trumper, 2001a, 2001b). Researchers have also paid close attention to the earth science understandings held by preservice and inservice teachers alike (Stofflett, 1993; Atwood & Atwood, 1995, 1996; Schoon, 1995; Trundle, 1999; Trundle, Atwood, & Christopher, 2002). While several alternative conceptions in earth science have been identified (e.g., phases of the moon are caused by the earth's shadow covering the moon, severity of winter can be predicted by observing animal and plant coverings, the sun is closer to the earth in summer causing us to be warmer), one striking observation must be noted: teachers hold many of the same alternative conceptions that their students hold.
Research Questions
The questions driving this study are:
1) Can a valid and reliable multiple choice instrument be developed to identify alternative conceptions in earth science?
2) What is the earth science content knowledge of preservice elementary teachers, and do they possess alternative conceptions?
Instrument
The decision to
use a multiple choice format stemmed from positive support found in the
research. Wandersee and Mintzes (1987)
found that multiple choice tests were the second most common research method,
after interviews, used in identifying alternative conceptions. Tamir (1990) provides several justifications
for using multiple choice tests: they can cover a wide range of topics in a relatively short time; if designed well, they can be used to measure different levels of learning; they are objective in scoring and therefore more reliable; they are easily scored; they are suitable for item analysis, which allows for test improvement; and they avoid penalizing students who know their subject but may be poor writers.
The original instrument was developed and piloted for the 2005 AETS (Association for the Education of Teachers in Science) Conference.
Item analysis of the original ten-question instrument was conducted.
Several of the questions from the original instrument were dropped because they were naïve or failed the rigors of item analysis. Original questions retained for the second instrument, as well as new questions, were organized and focused under four areas of earth science:
1) solar system: objects and changes in the earth and sky;
2) earth's history: structure and surface of the earth;
3) earth systems: rock and water cycles; and
4) climate: weather and atmosphere.
Distractors on the
original instrument were refined via item analysis and a more thorough review
of the literature. Additional
questionnaire items were selected from those used by other researchers, as
reported in the literature (e.g., the Educational Testing Service's Earth and Space Sciences: Content Knowledge test from The Praxis Series, and the work of Libarkin and colleagues).
Participants were asked
to choose the best answer instead of choosing
the correct answer. Schwab (1963) shares that when students are
asked to choose the best answer they are forced to analyze the various
options. Multiple choice items of this
type cater to a wider range of cognitive abilities. Schwab (1963) and Tamir (1971) suggest that
when choosing distractors one should avoid using transparently irrelevant
answers. Distractors drawn from students' preconceptions, and distractors that differ in their degree of wrongness, can instead serve as traps. Tamir
(1990) asserts that
“distractors in a multiple choice item function much like one of the standard procedures in a Piagetian classical interview. There the interviewer is not fully satisfied even when a child gives a correct answer, understanding is checked by suggesting an alternative answer.” (p. 564)
The development of multiple choice
tests on students’ alternative conceptions has the potential to make valuable
contributions to both the research of alternative conceptions as well as the
process of helping science teachers use findings in this area of research (Treagust,
1988).
Participants
The ACES-Q II was
administered to 56 preservice elementary teachers enrolled in their senior
block courses.
Validity
Validity
is the degree to which an instrument measures what it is supposed to measure. Several measures of the validity of the
ACES-Q II are discussed below.
Face validity,
sometimes considered a subform of content validity, was determined by
subjecting the ACES-Q II to the scrutiny of several professionals in the field
of science education. These experts concluded
that the ACES-Q II measures content knowledge and possible alternative
conceptions in earth science among participants taking the questionnaire.
Content validity
was further determined by identifying each of the earth science content
standards that the 20 questions addressed.
See Table 1 for clarification.
These four categories mirror the earth and space systems standards in the National Science Education Standards (NSES) and the Arkansas Science Frameworks.
Table 1
Specifications for Instrument Questions

Content Area | Question Numbers
Solar System | 1, 2, 3, 4, 5, 13
Earth History | 11, 14, 15, 20
Earth Systems | 6, 7, 8, 12
Climate/Weather | 9, 10, 16, 17, 18, 19
Reliability
Reliability
is a measure of an instrument’s ability to provide consistent results. The researchers followed several steps
suggested for improving an instrument’s reliability. These included: piloting the first instrument
and increasing the length of the questionnaire from ten to 20 questions.
Item
analysis was also performed to identify those questionnaire items with the
greatest ability to discriminate among participants. Based on a minimum acceptable discrimination
index of .20, 13 of the 20 items were above the acceptable standard.
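The study does not specify which discrimination index was computed; one common choice, the upper-lower group index, can be sketched as follows in Python (with simulated data standing in for the study's responses):

```python
import numpy as np

def discrimination_index(item_scores, group_fraction=0.27):
    """Upper-lower discrimination index for each item.

    item_scores: 2-D array (participants x items) of 0/1 scores.
    Compares the proportion answering each item correctly in the
    top- and bottom-scoring groups (27% of participants by default).
    """
    item_scores = np.asarray(item_scores)
    totals = item_scores.sum(axis=1)
    k = max(1, int(round(group_fraction * len(totals))))
    order = np.argsort(totals)
    low, high = item_scores[order[:k]], item_scores[order[-k:]]
    return high.mean(axis=0) - low.mean(axis=0)

# Simulated data: 56 participants x 20 items, ability-driven so the
# indices are meaningful (the study's real response data are not shown).
rng = np.random.default_rng(0)
ability = rng.normal(size=(56, 1))
sim = (rng.random((56, 20)) < 1 / (1 + np.exp(-ability))).astype(int)
d = discrimination_index(sim)
print("Items at or above the .20 criterion:", int((d >= 0.20).sum()))
```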
The KR-20 is a measure of internal consistency; that is, how well individual items relate to the other items in the questionnaire and to the test taken as a whole. Overall test reliability was determined to be .75 using the 'proc corr alpha nomiss;' statement in SAS (Statistical Analysis System). An acceptable KR-20 value should be no lower than .60 and preferably .80 or higher.
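For reference, the KR-20 coefficient reported from SAS can be reproduced from a participants-by-items matrix of 0/1 scores using the standard formula KR-20 = (k / (k - 1)) * (1 - sum(p*q) / variance of total scores). A minimal Python sketch with simulated, illustrative data:

```python
import numpy as np

def kr20(item_scores):
    """Kuder-Richardson 20 for dichotomously scored (0/1) items."""
    item_scores = np.asarray(item_scores, dtype=float)
    k = item_scores.shape[1]                      # number of items
    p = item_scores.mean(axis=0)                  # proportion correct per item
    q = 1.0 - p
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_variance)

# Simulated 56 x 20 matrix of item scores (illustrative only).
rng = np.random.default_rng(1)
ability = rng.normal(size=(56, 1))
sim = (rng.random((56, 20)) < 1 / (1 + np.exp(-ability))).astype(int)
print(round(kr20(sim), 2))
```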
Procedure
During
the fall semester of 2005, participants were identified via enrollment in their
senior block courses. Three cohorts were
chosen to participate in this study. The
researchers met with each group to inform them of the questionnaire and to gain
their consent to participate.
Data
were gathered and then examined with several factors in mind (gender, age,
college science courses taken, and earth science courses taken). The researchers focused on analyzing the
similarities and differences of test scores across these subgroups.
Results
Quantitative Data
Each of the twenty questions was designed to assess earth science content knowledge in one of four specific categories. Table 2 below shows the specific concept tested by each question, the percentage of participants who chose the correct answer and the correct reason, and, on a Likert scale, the mean level of confidence in the respondent's answer (1 = just a blind guess ... 5 = I am sure I am right) and the mean level of sensibility of the reason provided (1 = makes no sense ... 5 = makes perfect sense).
Table 2
Questions on the ACES-Q II

Question | Content Area | Answer Correct | Reason Correct | Confidence Mean | Sensibility Mean
1 | Solar System | 38.18% | 48.15% | 3.1 | 3.4
2 | Solar System | 60.71% | 66.07% | 3.6 | 3.8
3 | Solar System | 58.18% | 23.21% | 2.8 | 3.2
4 | Solar System | 44.64% | 46.43% | 3.8 | 3.8
5 | Solar System | 35.71% | 32.73% | 3.1 | 3.3
13 | Solar System | 41.07% | 60.71% | 2.7 | 3.1
6 | Earth Systems | 33.93% | 37.50% | 2.9 | 3.3
7 | Earth Systems | 44.64% | 53.57% | 3.3 | 3.7
8 | Earth Systems | 60.71% | 58.93% | 2.9 | 3.2
12 | Earth Systems | 50.00% | 37.04% | 2.5 | 3.3
9 | Climate/Weather | 14.55% | 16.07% | 3.9 | 4.0
10 | Climate/Weather | 75.00% | 78.57% | 3.2 | 3.4
16 | Climate/Weather | 60.71% | 58.93% | 3.1 | 3.4
17 | Climate/Weather | 39.29% | 25.00% | 2.8 | 3.1
18 | Climate/Weather | 39.29% | 63.64% | 2.3 | 2.8
19 | Climate/Weather | 17.86% | 21.43% | 4.1 | 4.3
11 | Earth History | 90.91% | 59.62% | 3.2 | 3.4
14 | Earth History | 82.14% | 31.48% | 2.2 | 2.7
15 | Earth History | 69.64% | 48.21% | 2.6 | 3.0
20 | Earth History | 69.64% | 29.63% | 3.0 | 3.1
Table 2 shows the percentage of participants choosing the correct answer and the correct reason for each question, followed by the mean confidence and sensibility ratings. It is interesting to note that fewer than half of the reason items were answered correctly by at least half of the participants. This indicates to us that more than half of our preservice teachers may still hold alternative conceptions in earth science.
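The per-question statistics in Table 2 amount to grouped percentages and means over the raw responses. A minimal sketch in Python/pandas, with hypothetical column names standing in for the study's actual data file:

```python
import pandas as pd

# Hypothetical raw data: one row per participant per question.
# Column names are illustrative; the study's actual data file is not shown.
responses = pd.DataFrame({
    "question":       [1, 1, 2, 2],
    "answer_correct": [1, 0, 1, 1],  # 1 if the chosen answer was correct
    "reason_correct": [1, 0, 0, 1],  # 1 if the chosen reason was correct
    "confidence":     [4, 2, 5, 3],  # 1 = just a blind guess ... 5 = sure I am right
    "sensibility":    [4, 3, 5, 4],  # 1 = makes no sense ... 5 = makes perfect sense
})

summary = responses.groupby("question").agg(
    answer_pct=("answer_correct", lambda s: 100 * s.mean()),
    reason_pct=("reason_correct", lambda s: 100 * s.mean()),
    confidence_mean=("confidence", "mean"),
    sensibility_mean=("sensibility", "mean"),
)
print(summary.round(2))
```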
Question nine has one of the lowest percentages of participants choosing a correct answer (14.55%) and a correct reason (16.07%). Conversely, it also
has one of the highest confidence means and sensibility means. Question nine asked students to identify the
biome that best describes their own region.
Table 3 shows the participants' mean scores according to subgroup. Total score was calculated from a participant's ability to choose both the correct answer and the correct reason for each question. Each question was worth one point, for a total of 20 possible points on the questionnaire. If a participant chose a correct answer but provided an alternative conception as the reason, she or he received a score of zero for that question. Conversely, if a participant did not choose the correct answer but chose a scientific conception for the reason, he or she also received a score of zero for that question (see the scoring sketch below).
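A minimal sketch of this two-tier scoring rule, using made-up answer keys and responses for illustration:

```python
# Two-tier scoring: a question earns one point only when both the answer
# tier and the reason tier are correct; otherwise it earns zero.
# The answer keys and responses below are made up for illustration.
def total_score(answers, reasons, answer_key, reason_key):
    return sum(
        1
        for a, r, ak, rk in zip(answers, reasons, answer_key, reason_key)
        if a == ak and r == rk
    )

# Three-question example: both tiers are correct only on the first question.
print(total_score(["b", "c", "a"], ["ii", "i", "iii"],
                  ["b", "d", "a"], ["ii", "i", "ii"]))   # prints 1
```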
Table 3
Means According to Subgroups

Subgroup | N | Mean | SD
Gender
  Female | 50 | 7.2 | 2.5
  Male | 5 | 10.2 | 3.5
Earth Science Classes Taken
  1 course | 28 | 7.6 | 2.7
  2 courses | 26 | 7.4 | 2.7
  4 courses | 1 | 4.0 | -
Total Science Classes Taken
  3 to 4 courses | 50 | 7.5 | 2.7
  5 to 7 courses | 5 | 6.8 | 3.0
Inspection of the means (Table 3)
showed differences among all subgroups.
On average, men scored higher than women on the ACES-Q II. These findings were consistent with those on
the ACES-Q (Leslie, Dockers, & Wavering, 2005).
An analysis of variance was conducted using the 'proc glm' statement in SAS to determine whether the differences among the means were significant. At the .05 alpha level, the analysis failed to reveal a significant effect on total score for the number of earth science courses taken (F(4, 50) = 0.78, p = .46) or the number of science classes taken (F(4, 50) = .01, p = .91). However, differences among total scores according to gender (F(4, 50) = 6.03, p = .02) were significant.
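The study ran these tests in SAS with 'proc glm'; an analogous one-way analysis of variance on total scores grouped by gender can be sketched in Python with SciPy, again using made-up scores rather than the study's data:

```python
from scipy import stats

# Hypothetical total scores (out of 20) grouped by gender; the study's
# actual data are summarized only as means in Table 3.
female_scores = [7, 6, 9, 8, 5, 7, 10, 6]
male_scores = [11, 9, 12, 8, 10]

f_stat, p_value = stats.f_oneway(female_scores, male_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```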
Correlations among
the confidence scores, sensibility scores, gender, and total scores were examined to see if any relationships existed. There was a slight positive correlation (.32) between gender and the total score on the ACES-Q II. There was also a slight positive correlation between an individual's confidence and sensibility scores and their total score (.42 and .32, respectively). Each of these correlations was significant at the
alpha .05 level. These results were similar to those found with the original ACES-Q (Leslie, Dockers, & Wavering, 2005).
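The correlations described above are ordinary bivariate coefficients (a point-biserial correlation in the case of gender). A minimal sketch with hypothetical values:

```python
from scipy import stats

# Hypothetical per-participant values (illustrative only).
confidence_means = [3.1, 2.8, 4.0, 3.5, 2.2, 3.9, 3.0, 2.6]
total_scores     = [7,   6,   12,  9,   4,   11,  8,   5]

r, p = stats.pearsonr(confidence_means, total_scores)
print(f"r = {r:.2f}, p = {p:.3f}")
```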
Qualitative Data
One question on the ACES-Q II was designed to allow participants to share their thoughts regarding teaching and content. The question asked participants how they would teach about a solar eclipse to the grade level of their choice. Of our 56 participants, 53 chose to answer this question. Several salient themes emerged as we coded the responses; these themes were divided into two sections: pedagogy and content knowledge. A list of several recurring themes and their frequencies can be found in Table 4 below. Among the content knowledge themes listed below, it is important to note that nine of the participants acknowledged a lack of sufficient content knowledge in themselves. This supports the findings of Smith and Neale (1989), Parker and Heywood (2000), Kallery and Psillos (2001), and Weiss, Banilower, McMahon, and Smith (2001), which show that teachers feel they lack specific content knowledge, especially in the areas of math and science.
Table 4
Frequency of Emerging Themes from ACES-Q II Question 21

Pedagogy | Content Knowledge
Schema Activation (3) | Sun, Moon, & Earth (10)
School Textbook (3) | Sun Between Earth & Moon (1)
Internet (5) | Earth's Shadow Covers Sun (1)
Model (20) | Sun Passes in Front of Earth (2)
Books & Literature (17) | Moon Blocks Sun (7)
Posters & Visual Aids (13) | Moon Passes Between Earth & Sun (1)
Observe Directly (3) | Read Up, Review, & Learn More (9)
Movie & Video (7) |
Explain (5) |
Discuss (7) |
Question Students (1) |
Draw, Art, & Drama (6) |
Several of the participants responded in a way that allowed us to discover whether they held alternative conceptions or scientific understandings. Eleven of the 53 participants answered with a plainly stated alternative conception or science fragment, while four participants answered with a clearly stated scientific understanding.
These fifteen responses were reexamined
to see if there were any similarities or differences in pedagogical choices
based on the content knowledge. Both groups shared pedagogical approaches using
models and demonstrations; however, the group holding scientific conceptions
also suggested questioning students and activating schema.
Discussion
The two major
questions addressed by this research were 1) Can a valid and reliable multiple
choice instrument be developed to identify alternative conceptions in earth
science? and 2) What is the earth science content knowledge of preservice
elementary teachers and do they possess alternative conceptions?
Our continued research
does in fact indicate that a valid and reliable instrument can be created to
identify alternative conceptions in earth science. The researchers understand that continued
refinement of the instrument will lead to an even more stable questionnaire.
Further studies will be carried out, and individual interviews of participants will be conducted to see whether answers on the written format agree with answers in an interview format. Data from parallel forms of the instrument may add to the overall reliability of the instrument by showing whether a relationship exists between the answers given by the same participants on a verbal form of the questionnaire. Instructor interviews regarding the usefulness of the instrument, along with ratings of its content, will lend strength to the validity of the instrument.
Although lengthening the second instrument increased its reliability, the researchers feel that the overall length may be a limiting factor of the questionnaire. Participants became frustrated easily, and a helpless feeling of "I don't know" was pervasive in the testing environment.
Our second
question focused on the content knowledge of our preservice elementary teachers
and any possible alternative conceptions they might hold. The data support the conclusion that a majority of our
preservice elementary teachers may still hold alternative conceptions in
several areas of earth science.
Schoon (1992, 1995) further divides alternative conceptions into primary and secondary alternative conceptions. Primary alternative conceptions are those that are more prevalent than scientific understandings (Schoon, 1992, 1995). Secondary alternative conceptions are those that are common yet less prevalent than scientific understandings (Schoon, 1992, 1995).
Data from the ACES-Q II indicated that our preservice elementary teachers may possess several primary and secondary alternative conceptions. Possible primary alternative conceptions include the belief that, as the moon orbits the earth, we are able to see 100% of its surface.
These alternative conceptions, whether primary or secondary, are not unique to our participants. Several studies (Schoon, 1989, 1992, 1995; Callison, 1993; Atwood & Atwood, 1995, 1996; Schoon & Boone, 1998; Trundle, 1999; Trundle, Atwood, & Christopher, 2002, 2003) concur that children, adults, and preservice and seasoned teachers continue to hold alternative conceptions in earth science.
The ACES-Q II
represents four areas of earth science content addressed in the NSES and
Arkansas Frameworks. Data from the means
indicated that our preservice teachers were familiar with less than half of the
content covered on the ACES-Q II. This concerns
us, as most of our preservice teachers plan to teach somewhere in the state.
In closing, it is our hope that, through awareness of alternative conceptions and through teaching for conceptual change, all preservice teachers will become aware of the alternative conceptions they may hold and of the possibility that they may pass those on to their students. We anticipate that more of our students will adopt the attitude of this participant in our study:
“I would first study the topic to make sure I do not teach false
information.”
- Preservice Teacher
References
Atwood, V. A., & Atwood, R. K. (1995). Preservice elementary teachers' conceptions of what causes night and day. School Science and Mathematics, 95(6), 290-294.
Atwood, R. K., & Atwood, V. A. (1996). Preservice elementary teachers' conceptions of the causes of the seasons. Journal of Research in Science Teaching, 33(5), 553-563.
Aubrey, C. (1996). An investigation of teachers' mathematical subject knowledge and the processes of instruction in reception classes. British Educational Research Journal, 22(2), 181-197.
Baxter, J. (1989). Children's understanding of familiar astronomical events. International Journal of Science Education, 11(Special Issue), 502-523.
Callison, P. L. (1993). The effect of teaching strategies using models on preservice elementary teachers' conceptions about earth-sun-moon relationships. Unpublished doctoral dissertation.
Franklin, B. J. (1992). The development, validation, and application of a two-tier diagnostic instrument to detect misconceptions in the areas of force, heat, light, and electricity (Doctoral dissertation, Louisiana State University and Agricultural and Mechanical College, 1992). Dissertation Abstracts International, 53/12, 4186.
Grossman, P., Wilson, S. M., & Shulman, L. S. (1989). Teachers of substance: Subject matter knowledge for teaching. In M. C. Reynolds (Ed.), Knowledge base for the beginning teacher (pp. 23-36).
Hashweh, M. Z. (1987). Effects of subject matter knowledge in the teaching of biology and physics. Teaching and Teacher Education, 3, 109-120.
Hopkins, K. D. (1997). Educational and psychological measurement and evaluation (8th ed.). Allyn & Bacon.
Jones, B. L., & Lynch, P. P. (1987). Children's conceptions of the earth, sun and moon. International Journal of Science Education, 9(1), 43-45.
Kallery, M., & Psillos, D. (2001). Pre-school teachers' content knowledge in science: Their understanding of elementary science concepts and of issues raised by children's questions. International Journal of Early Years Education, 9(3), 165-179.
Leslie, K., Dockers, J., & Wavering, M. (2005). What do they know? A look into preservice teachers' earth science content knowledge. Paper presented at the AETS Annual Conference, January 2005.
Parker, S., & Heywood, D. (2000). Exploring the relationship between subject knowledge and pedagogic content knowledge in primary teachers: Learning about forces. International Journal of Science Education, 22(1), 89-111.
Schoon, K. J. (1989). Misconceptions in the earth sciences: A cross-age study. Paper presented at the annual meeting of the National Association for Research in Science Teaching.
Schoon, K. J. (1992). Students' alternative conceptions of earth and space. Journal of Geological Education, 40, 209-214.
Schoon, K. J. (1995). The origin and extent of alternative conceptions in the earth and space sciences: A survey of pre-service elementary teachers. Journal of Elementary Science Education, 7(2), 27-46.
Schoon, K. J., & Boone, W. J. (1998). Self-efficacy and alternative conceptions of science of preservice elementary teachers. Science Education, 82(5), 553-568.
Schwab, J. J. (1963). The biology teachers' handbook.
Shulman, L. (1986). Paradigms and research programs in the study of teaching. In M. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 3-36).
Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-22.
Smith, D. C., & Neale, D. C. (1989). The construction of subject matter knowledge in primary science teaching. Teaching and Teacher Education, 5(1), 1-20.
Stofflett, R. T. (1993). Preservice elementary teachers' knowledge of rocks and their formation. Journal of Geological Education, 41, 226-230.
Tamir, P. (1971). An alternative approach to the construction of multiple choice test items. Journal of Biological Education, 5, 305-307.
Tamir, P. (1990). Justifying the selection of answers in multiple choice items. International Journal of Science Education, 12(5), 563-573.
Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students' misconceptions in science. International Journal of Science Education, 10(2), 159-169.
Trumper, R. (2001a). A cross-age study of senior high school students' conceptions of basic astronomy concepts. Research in Science and Technological Education, 19(1), 97-107.
Trumper, R. (2001b). Assessing students' basic astronomy conceptions from junior high school through university. Australian Science Teachers Journal, 41(1), 21-32.
Trundle, K. C. (1999). Elementary preservice teachers' conceptual understandings of the cause of moon phases. Unpublished doctoral dissertation.
Trundle, K. C., Atwood, R. K., & Christopher, J. E. (2002). Preservice elementary teachers' conceptions of moon phases before and after instruction. Journal of Research in Science Teaching, 39(7), 633-658.
Trundle, K. C., Atwood, R. K., & Christopher, J. E. (2003). Preservice elementary teachers' conceptions of standards-based lunar concepts for grades K-4. Paper presented at the Association for the Education of Teachers in Science Annual Conference.
Wandersee, J. H., & Mintzes, J. J. (1987). Children's biology: A content analysis of conceptual development in the life sciences. In J. Novak (Ed.), Proceedings of the Second International Seminar on Misconceptions and Educational Strategies in Science and Mathematics (Vol. 2, pp. 522-534).
Wandersee, J. H., Mintzes, J. J., & Novak, J. D. (1994). Research on alternative conceptions in science. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 177-210).
Weiss, I. R., Banilower, E. R., McMahon, K. C., & Smith, P. S. (2001). Report of the 2000 National Survey of Science and Mathematics Education. Horizon Research, Inc.
Wilson, S., Shulman, L., & Richert, A. (1987). 150 ways of knowing: Representation of knowledge in teaching. In J. Calderhead (Ed.), Exploring teachers' thinking.