THE FLOW LAB: A SIMPLE ACTIVITY FOR GENERATING NOS PRINCIPLES
Daniel Z. Meyer, Illinois Institute of
Technology
Leanne M. Avery, State University of New
York College at Oneonta
The Problem of Nature of Science
While there is near-universal agreement within the science
education community that a strong understanding of the nature of science (NOS),
both as a student goal and a teacher attribute, is critical, we still
struggle with how to achieve this aim.
Barriers range from pragmatic logistics to fundamental curricular
tensions. In this paper, we share a
classroom activity, designed to aid the learning of key NOS principles, that
we have found constructive in overcoming such barriers.
Principles of the Nature of Science
Researchers
in science education have settled on several tenets that depict the nature of
science (Lederman 1992;
Abd-El-Khalick, Bell et al. 1998; Akerson, Abd-El-Khalick et al. 2000;
Abd-El-Khalick 2002). These general aspects of
NOS are that scientific knowledge is tentative (subject to change),
empirically based (derived from observations of the natural
world), subjective (theory-laden), and socially and culturally embedded, and
that it necessarily involves human inference, imagination, and creativity
(the invention of explanations). Many of the
national reform efforts have called for infusion of understandings of the
nature of science in science education (American Association for
the Advancement of Science 1993; National Research Council 1996). Science
educators have tried both implicitly and explicitly
to incorporate NOS into their courses (Akerson, Abd-El-Khalick et
al. 2000; Abd-El-Khalick 2002), with mixed results. The challenge for
science educators today remains finding ways to teach about NOS, using
explicit, implicit, and inquiry approaches, that foster teachers' abilities
to teach it in their own classrooms.
Approaches and Problems in Developing NOS
Recent research has focused on distinguishing between
historical, implicit, and explicit approaches to teaching NOS (Khishfe and
Abd-El-Khalick 2002). Historical approaches, which utilize case studies
as a means of exposing students to the nature of science, have had a mixed
record (Solomon, Duveen
et al. 1992). Implicit approaches engage students in inquiry-based
activity (e.g., engaging learners in asking scientifically oriented
questions, collecting and using evidence to address these questions,
formulating explanations and evaluating them in light of alternatives, and
communicating and justifying their proposed explanations (National
Research Council 1996))
with the intention that by doing science, students will gain an understanding
of the nature of science. Some have
criticized this approach for treating understanding of NOS as an affective
attribute rather than a cognitive outcome, and have therefore argued for
explicit discussion of NOS principles. Approaches including
explicit instruction have been shown to be more effective (Khishfe and
Abd-El-Khalick 2002).
In the logistical realm, time and materials are often
barriers. At the systemic level,
misperceptions about inquiry, pressures of accountability for content coverage,
and beliefs about the appropriateness of teaching science as it is actually
practiced in school science (Cunningham 1995;
Avery 2003)
also impede the implementation of inquiry-based science in classrooms. At the curricular level, inquiry-based
activities that are both complex enough to demonstrate NOS principles and
accessible are difficult to design and implement.
The difficulty in designing activities to demonstrate NOS
principles can be summarized by considering two balancing acts.[1] The problem space – particularly the data
participants will utilize – can range from simple to complex. If the problem space is too simple, the
activity becomes a confirmation lab; if it is too complex, it becomes undoable
in the classroom. Likewise, the
particular assignment or charge given to participants can vary from very
specified to completely open-ended. Too
specified an assignment will be a traditional cookbook lab; too open-ended an
assignment will leave students not knowing where to go. We have found utility
in an activity that addresses the issues surrounding implementing inquiry as
well as those related to curriculum design.
The Flow Lab
We first present the barebones Flow Lab activity, and then
discuss how it might be implemented.
The Basic Activity
The Flow Lab originated as an exercise for modeling
the dynamics of a water tower (Carlsen and
Trautmann 2004). The basic activity uses the following simple
apparatus: The bottom of an empty
plastic bottle is cut off, and a small hole is drilled in the cap. The bottle is held upside down on a ring
stand. A known amount of water is poured
into the bottle while the hole is held shut.
The hole is then uncovered for ten seconds, and the amount of water that
flows out is measured. Students are then
instructed to experiment with different starting volumes in order to gain an
understanding of the relationship between starting volume and collected
volume. They are told that they will be
challenged to predict the outflow volume for a new starting volume.
What appears to be a straightforward (and somewhat traditional)
lab activity then generates experiences around key NOS principles. We will discuss these by dividing the
experience into three phases: data collection, data analysis and sharing, and
explicit reflection.
Data Collection
Participants are immediately faced with a situation not
experienced in traditional “cookbook” lab exercises: while they have been given
the basic protocol for measurement, they must establish the specific
implementation to acquire the necessary data.
More specifically, they must decide the number, range and distribution
of test points. There is no external
authority directing this choice.
However, neither are the participants working in a vacuum, where anything
is equally valid. They have the pressure
of an explicit challenge: effectively predict the outflow for a starting value
they have not tried. This will motivate
their argumentation and resolution on what test range to use.
Besides the data range, participants flesh out other details
of the procedures. Routines will be established
such as who does what tasks, how timing is coordinated, etc. These are not necessarily conscious
decisions, but rather represent the development of tacit skills around this
research agenda.
At some point, participants will need to grapple with the
occurrence of potential anomalies.
Results will occur that vary from what the participants have come to expect. They will then have to decide how to respond:
should the result be rejected as an anomaly?
Should the trial be repeated? As
discussed below, there is no external arbiter.
For this portion of the activity, the instructor’s role is
as a facilitator, and a fairly passive one.
For many questions participants pose, the instructor can refer back to
the basic challenge, and impress on them the need for the group to make choices
in light of that challenge. For example,
when participants ask how many data points are necessary, the instructor can say
that this is up to them, so long as they meet the challenge of being able to predict
the outflow.
Data Analysis and Sharing
Analysis of the resulting data can proceed in two phases:
analysis within each group for the purpose of addressing the challenge, and sharing
between groups. The Appendix shows
examples of participant data. As we
discuss below, the variety seen in the data sets is crucial to the NOS
experience.
As in the data collection phase, participants will often fall into the
mode of doing a traditional lab – they will often ask if their results are
correct. The instructor can turn
participants’ focus first to the general question of conclusions about the
relationship between starting volume and outflow, and then to the specific
challenge of predicting an outflow.
Since these two questions are in essence the same question – making a
prediction requires a general conclusion about the relationship – the
instructor can move back and forth between the two to respond to different
participant reactions. For example, some
participants may be reluctant to make a general conclusion, or simply be unused
to being asked such an open question.
The instructor can therefore shift to asking the group how they would go
about making a prediction. This might
prompt participants to identify a best-fit curve as the tool they would use,
which essentially reifies their general conclusion.
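To make this move concrete, the following is a minimal sketch of how a best-fit curve turns a group's data into a prediction. The numbers, the choice of a linear fit, and the challenge value are all hypothetical illustrations rather than anything taken from the Appendix data sets; a group could equally argue for a different functional form.

    # Minimal sketch: using a least-squares line as the prediction tool.
    # All (starting volume, outflow) pairs below are invented for illustration.
    import numpy as np

    starting_ml = np.array([200, 400, 600, 800, 1000])   # hypothetical test points
    outflow_ml = np.array([55, 80, 98, 110, 121])        # hypothetical 10 s outflows

    # Fit a straight line; deg=1 returns (slope, intercept).
    slope, intercept = np.polyfit(starting_ml, outflow_ml, deg=1)

    challenge_ml = 700                                   # hypothetical instructor challenge
    prediction = slope * challenge_ml + intercept
    print(f"Predicted outflow at {challenge_ml} ml: {prediction:.0f} ml")

Committing to a particular fit in this way is exactly the reification of a general conclusion described above: the prediction cannot be made without implicitly asserting what the relationship is.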
Alternatively, the instructor can ask specific
questions about trends, outliers, etc. in order to bring them to the
participants’ attention. For example,
with Data Set 5, participants could be queried about the data point at a
starting volume of 500ml. (Note that the
two circled data points did not exist during this initial analysis.) Is this point an outlier? Is the phenomenon following a smooth
curve? The ambiguous nature of the data
means that the resulting discussions around these questions are not trivial, but
represent genuine scientific argumentation.
The choice of what specific starting value to give to each
group to make a prediction for can be used in a very calculated way to prompt
certain deliberations on the part of participants. For example, again with Data Set 5, the
challenge of predicting the outflow for a starting volume of 525 ml was used
to force a conclusion about whether the 500 ml data point was an
outlier and whether the phenomenon was following a smooth curve. Questioning the scope of conclusions is
another common issue that can be raised at this point. In Data Set 2, the participants were given a
challenge point outside the general range of their data, which forced consideration of
whether the curve was asymptotic or continued to rise.
Beyond analysis within one data set, comparing data between
groups can raise additional issues and lead to additional debates. Having groups use different bottle types and
different hole sizes helps ensure this variation. While some data sets might appear quite
linear, others will display more of a logarithmic curve. Still others will exhibit some plateau
phenomena.
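One way to see how such different apparent shapes can coexist is to fit more than one candidate form to the same data. The sketch below, using invented numbers rather than the Appendix data sets, fits both a straight line and a square-root curve; the two describe the measured range comparably but disagree more noticeably when extrapolated beyond it.

    # Sketch: two plausible models for the same invented data set.
    import numpy as np

    starting_ml = np.array([200, 400, 600, 800, 1000])
    outflow_ml = np.array([62, 85, 101, 115, 128])

    # Candidate 1: straight line. Candidate 2: square-root curve
    # (one possible sublinear form; a logarithmic fit would serve as well).
    lin_coef = np.polyfit(starting_ml, outflow_ml, deg=1)
    sqrt_coef = np.polyfit(np.sqrt(starting_ml), outflow_ml, deg=1)

    def lin_model(v):
        return np.polyval(lin_coef, v)

    def sqrt_model(v):
        return np.polyval(sqrt_coef, np.sqrt(v))

    for v in (600, 1500):   # inside vs. well outside the tested range
        print(v, round(float(lin_model(v)), 1), round(float(sqrt_model(v)), 1))

Which candidate is "right" cannot be settled from the data alone, and that ambiguity is precisely what fuels the debates described below.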
Sharing data sets in the order they are shown in the
Appendix is an example of how the instructor can foster critical debate. An issue that arose in some groups but not
others was whether there was a plateau phenomenon. When Data Sets 1 and 2 were
shared, those groups considered any slight dips as anomalies, and this position
was supported by the class as a whole.
However, when Data Set 3 was presented, that group considered the dip to
be real, and argued as much. As
additional data sets were shared, this position grew in acceptance.
As arguments for various conclusions are made, participants
will also make restrictions on their claims in order to gain more
acceptance. In particular, arguments
will be made that data sets such as 1 and 3 are not in contradiction, but
rather represent two different areas of the phenomenon. Besides the plateau question, the long-range
nature of the phenomenon (asymptotic, linear, logarithmic) is another common
point of dispute. Participants will
often make reference to external theories, such as arguing that there is a
limit to how much water molecules can be compressed.
Participants will also make references to the differences in
bottles. They will suggest, or the
instructor can probe for, possible follow-up experimentation to clarify some of
the ambiguity caused by the differences in bottles and holes, and to otherwise
explore the phenomenon further. This variation in
experimental setup might seem like a poor experimental design – and to some
extent it is. However, by having the
problems emerge in this manner, the importance of holding variables constant
becomes apparent rather than remaining an isolated mandate from the teacher
authority.
Explicit Reflection
After deliberation over the flow data itself, discussion can
turn to more explicit, reflective consideration of what happened. This can also include references to various
NOS-related readings. Thus this activity
provides the opportunity to make participants' implicit experience of NOS
concepts explicit, and to personalize their academic study of those concepts.
The most significant concept generated by the Flow Lab
activity is the occurrence of interpretive flexibility. The data that is generated, both within a
single group and between groups, is ambiguous enough that multiple conclusions
are possible (and arguable). The need to
adopt a position is driven by the challenge to make a prediction. This phenomenon mirrors the research on
gravitational waves reported by Collins (1975; 1985) and on neutrino detection
reported by Pinch (1981; 1985).
Interpretive flexibility is tightly connected with the
concept of experimenter’s regress (Collins 1985): when a new phenomenon is
being studied, no external arbiter exists to distinguish whether unexpected
results are due to a mistake in the theory the prediction is based on or a flaw
in experimental technique. This is
precisely the position participants find themselves in (and never do in
confirmation labs). A surprising data
point may be anomalous, or may be revealing the trend they are trying to find.
The interpretive flexibility in turn prompts significant
argumentation and negotiation among the participants around different
scientific conclusions. This reflective
discussion can be used to explore all the nuances of that process: how
restrictions are added or removed from claims to make them more acceptable (Latour and Woolgar
1986);
how outside references are utilized (Toulmin 1964; Latour
1987);
how presentation can affect audience response (Collins 1975; Tufte
1997). This can all be viewed as part of the broad
process of social construction of knowledge, and provides a way out of the
experimenter’s regress. In particular, a
contrast can be shown between groups' comparative lack of confidence when only
looking at their own data, and how that confidence changes as they interact
with other groups. This reflects the
process of closure described by Collins (1981) and numerous other
sociologists (see for example Latour
and Woolgar 1986; Bijker 1987; Cowan 1987; Kline and Pinch 1996).
While taken for granted during the activity itself, the
concept of a black box – a conceptual or physical tool that is utilized without
any challenge to its validity – can be explicitly discussed afterwards. Participants both make significant use of
pre-existing black boxes (e.g. graduated cylinders, the concept of a line of
best fit) and black box new concepts during the course of the activity (e.g.
routines for running trials, “plateauing” as a reference to the data flattening
out for a range). Furthermore, it can be
pointed out how in the face of disputes, some areas of the experimental process
were more likely to be questioned than others.
This is comparable to Pinch’s (1985) description of how competing
researchers would look for weak points in the inference chain to attack
neutrino data results.
Finally, the participants' work can be seen as occurring
within a preexisting framework or paradigm.
They have a degree of expectation about the results that at the very
least includes the relationship being a continuous function that generally increases,
and never decreases, as the input is increased, and that in some cases is restricted to a
linear relationship. Adherence to a
paradigm can be seen in participants’ conclusions and arguments, just as
described by Kuhn (1970).
Instructor Role
While we present this activity as one inherently rich in NOS
principles, we also want to emphasize the importance of the instructor in
maximizing the activity's benefit. It
takes an instructor who, besides having a strong understanding of NOS
themselves, has a keen sense of how and when to interact with students in order
to positively affect their experience.
This is an instance where the traditional student-centered vs. teacher-centered
dichotomy is insufficient, and even counterproductive. We agree wholeheartedly with the spirit of
student-centered learning insofar as students must be engaged in substantive,
active work, and teachers must avoid being sources of simple, unchallengeable
information. But this does not mean that
students should be placed on auto-pilot with a hands-off policy on the part of
teachers. In addition, given that
intervening too overtly and dominantly is the overwhelming tendency for
teachers, perhaps we should consider waiting and holding back as an active move
on the teacher’s part.
Discussion
Spectrum of Implementation
Our development and reporting of this approach is based on
utilizing the activity in a variety of settings over multiple iterations. It has served as the introduction to the
nature of science in elementary and secondary methods and curriculum design
classes. It has been incorporated into
science content classes. We have used it
as a workshop activity for in-service science teachers and NSF GK-12 graduate
Fellows.
The basic activity is adjustable to participants’ varying
needs. For example, the context of
modeling water towers is included with elementary teachers to provide
a sense of purpose for the activity, while secondary science teachers tend to be
more comfortable with a more abstract experiment. In general, the accessibility of the activity
is a key element for overcoming elementary teachers’ aversion to science
whereas the ambiguity of the results is key to breaking secondary science
teachers' preconceptions about lab activities.
Benefits
We see two major advantages to this approach. First, the activity provides a mechanism to
essentially couple the historical, implicit and explicit approaches. Concepts that can be explicitly discussed and
illustrated through case studies are also made concrete through the students’
experience. Second, the activity
effectively achieves a balance point on the axes described above. The data generated are sufficiently complex
and ambiguous to generate experiences of NOS principles. At the same time, the activity is relatively
cheap, quick, and accessible to a wide range of students. The charge to students is both concrete and
significant.
We see a number of additional appeals as well. The essential element of the activity,
however, is the phenomenon
itself, and the data it tends to generate.
As can be seen in the example data sets, the Flow Lab will generate data
that is sufficiently ambiguous that interesting discussion can occur. Providing students with an experience in
which they participate in the process of making raw, ambiguous data into
scientific conclusions requires access to interesting (i.e. messy) data. The Flow Lab’s benefit is in providing
predictably unpredictable results.
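For readers who want a feel for why the results are predictably unpredictable, the following rough simulation may help. The activity itself does not depend on any particular physical model; the sketch simply assumes, purely as an illustration, idealized Torricelli-style drainage from a roughly cylindrical bottle with a little reading noise. The bottle and hole dimensions are invented, not measured from any actual setup.

    # Rough simulation of 10-second outflows under an assumed Torricelli model.
    import math, random

    def ten_second_outflow(start_ml, bottle_diam_cm, hole_diam_mm):
        """Crude Euler integration of the outflow over 10 s."""
        bottle_area = math.pi * (bottle_diam_cm / 2) ** 2    # cm^2
        hole_area = math.pi * (hole_diam_mm / 20) ** 2       # cm^2
        g = 981.0                                            # cm/s^2
        volume = float(start_ml)                             # 1 ml = 1 cm^3
        out, dt = 0.0, 0.01
        for _ in range(int(10 / dt)):
            head = volume / bottle_area                      # water height in cm
            if head <= 0:
                break
            dv = min(hole_area * math.sqrt(2 * g * head) * dt, volume)
            volume -= dv
            out += dv
        return out + random.gauss(0, 3)                      # a few ml of reading noise

    # Two hypothetical groups with different apparatus produce curves of
    # noticeably different apparent shape from the same simple procedure.
    for label, diam_cm, hole_mm in (("group A", 9, 3), ("group B", 7, 4)):
        data = [(v, round(ten_second_outflow(v, diam_cm, hole_mm)))
                for v in range(200, 1201, 200)]
        print(label, data)

Even in this idealized form, the same protocol yields data sets that invite different conclusions, which is the property the actual apparatus supplies far more convincingly.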
Appendix
The following are examples of participants’ data from the
flow lab activity. The different groups
used a variety of bottle types and hole sizes.
In most diagrams, participants have also labeled their predicted outflow
volume and actual outflow volume for the starting volume the instructor
provided.
Data Set 1
Data Set 2
Data Set 3
Data Set 4
Data Set 5
Data Set 6
References
Abd-El-Khalick,
F. (2002). The influence of a philosophy of science course on preservice
secondary science teachers' views of nature of science. Annual Meeting of
the Association of Educators of Teachers of Science.
Abd-El-Khalick,
F., R. L. Bell, et al. (1998). "The nature of science and instructional
practice: Making the unnatural natural." Science Education 82(4): 417-436.
Akerson, V., F.
Abd-El-Khalick, et al. (2000). "Influence of a reflective explicit
activity based approach on elementary teachers conceptions of nature of
science." Journal of Research in Science Teaching 37(4): 295-317.
American
Association for the Advancement of Science (1993). Benchmarks for science
literacy. New York, Oxford University Press.
Avery, L. M.
(2003). Knowledge, Identity, and Teachers' Communities of Practice. Education.
Ithaca, NY, Cornell University.
Bijker, W.
(1987). The social construction of Bakelite: Towards a theory of invention. The
social construction of technological systems. W. E. Bijker, T. P. Hughes
and T. J. Pinch. Cambridge, Mass, MIT Press:
159-187.
Carlsen, W. S.
and N. M. Trautmann (2004). Watershed Dynamics. Arlington, VA, NSTA
Press.
Collins, H. M.
(1975). "The seven sexes: A study in the sociology of a phenomenon, or the
replication of experiments in physics." Sociology 9: 205-224.
Collins, H. M.
(1981). "Stages in the empirical programme of relativism." Social
Studies of Science 11: 3-10.
Collins, H. M.
(1985). Changing order: Replication and induction in scientific practice.
Chicago, The University of Chicago Press.
Cowan, R. S.
(1987). The consumption junction: A proposal for research strategies in the
sociology of technology. The social construction of technological systems.
W. E. Bijker, T. P. Hughes and T. J. Pinch. Cambridge, Mass, MIT Press: 261-280.
Cunningham, C. M.
(1995). The effect of teachers' sociological understanding of science on
classroom practice and curriculum innovation. Ithaca, NY, Cornell University.
Vol 56, no 7, January 1996, DAI
Order Number DA9538839
Khishfe, R. and
F. Abd-El-Khalick (2002). "Influence of Explicit and Reflective versus
Implicit Inquiry-Oriented Instruction on Sixth Graders' Views of Nature of
Science." Journal of Research in Science Teaching 39(7): 551-578.
Kline, R. and T.
Pinch (1996). "Users as agents of technological change: The social
construction of the automobile in the rural United States." Technology
and Culture 37: 763-795.
Kuhn, T. S.
(1970). The structure of scientific revolutions. Chicago, University of
Chicago.
Latour, B.
(1987). Science in action. Cambridge, MA, Harvard University Press.
Latour, B. and S.
Woolgar (1986). Laboratory life: The construction of scientific facts.
Princeton, NJ, Princeton University Press.
Lederman, N. G.
(1992). "Students' and teachers' concpetions about the nature of science:
A review of the research." Journal of Research in Science Teaching 29: 331-359.
National Research
Council (1996). National science education standards. Washington,
National Academy Press.
Pinch, T. (1981).
"The Sun-set: The presentation of certainty in scientific life." Social
Studies of Science 11: 131-158.
Pinch, T. (1985).
"Towards an analysis of scientific observation: The externality and
evidential significance of observational reports in physics." Social
Studies of Science 15: 3-36.
Solomon, J., J.
Duveen, et al. (1992). "Teaching About the Nature of Science through
History: Action Research in the Classroom." Journal of Research in
Science Teaching 29(4): 409-421.
Toulmin, S.
(1964). The Uses of Argument. Cambridge, Cambridge University Press.
Tufte, E. R.
(1997). Visual and Statistical Thinking: Displays of Evidence for Making
Decisions. Cheshire, CT, Graphics Press LLC.
[1] This can be seen as a summary of the somewhat more general problem of designing inquiry-based education.