

S.MA.R.T.S. (Science and Math Avenues to Renewed Teachers and Students)

END OF PROJECT REPORT

PIs: John Dunkhase and Walter Seaman, University of Iowa

We include the four MSP-recommended sections in this report.

I. Cover Sheet
II. Executive Summary
III. Project Performance (and Budget)
IV. Supplemental Information (and Appendices)

I. COVER SHEET



II. EXECUTIVE SUMMARY

SMARTS (Science and Math Avenues to Renewed Teachers and Students) was a three-year (2005-2008) Iowa Department of Education and Iowa Board of Regents Mathematics and Science Partnership grant funded through Title II funds in the amount of approximately $450,000 (PIs: John Dunkhase, Clinical Associate Professor of Teaching and Learning, University of Iowa, and Walter Seaman, Associate Professor of Mathematics and of Teaching and Learning, University of Iowa). The staff included nine Grant Wood Area Education Agency (GWAEA) science and mathematics consultants. By the end of the three years there were 42 participating elementary school teachers from 12 GWAEA schools in 6 LEAs.

The SMARTS project goals were to: 1. enhance science and math instruction through an inquiry approach, 2. improve student achievement and 3. nurture collaboration among participating teacher teams. Work toward these goals was implemented in each of the three years by a combination of science and mathematics content and pedagogy-focused summer institutes and four academic-year follow-up seminars, with Lesson Study as the implementation vehicle. On-site staff support for content and the lesson study process was also employed during the school year. A SMARTS web board was used for communications and discussion topics, and there was a sharing of learning in annual day-long Spring meetings.

Project Evaluation and Impact

SMARTS employed a mixed-method, quasi-experimental evaluation design. See, for example, the SMARTS Year 2 Annual Progress Report (APR), section 7 Evaluation Components, for additional details.

As part of the SMARTS evaluation work, analyses of SMARTS Survey of Enacted Curriculum (SEC) data and student ITBS data were conducted by the University of Iowa Center for Evaluation and Assessment and the Educational Measurement and Statistics program. We include outcomes from those analyses in this report.

In the other part of the evaluation work, SMARTS project staff analyzed a variety of outputs. Staff considered teacher instructional practice and content knowledge by analysis of teacher-submitted student work consisting of Quick Write and Distributed Practice quiz responses and teacher scoring rubrics, science/mathematics notebooks/journal submissions, teacher-constructed SMARTS web board discussions, science and mathematics unit story lines, concept maps, Lesson Study documentation and reports, Survey of Enacted Curriculum (SEC) data, and Zoomerang on-line survey data on teaching science and mathematics (administered in February 2007). Additional teacher and administrator survey instruments and interviews during the project focused on administration support, practice and implementation, and on participant reactions to SMARTS workshops. Outcomes of these analyses were included in the Annual Progress Reports, and we include concise statements about those outcomes in this report as well.

Below we first give a summary and our reflections on the overall outcomes of SMARTS. After that we give summaries of results from several SMARTS evaluation components.

OVERALL SUMMARY AND REFLECTIONS

We believe, and give evidence in this report, that SMARTS had a positive impact on helping teachers implement an increased level of inquiry- and exploration-based approaches to mathematics and science teaching and learning. Different teachers appeared to be implementing these approaches with varying levels of intensity, but there was noted growth in these areas in many of the participants.

We believe, and give evidence in this report, that SMARTS had some of its most visible positive impact in the incorporation and adoption of the Lesson Study professional development model by several entire SMARTS schools, and by the entire district of one SMARTS school. In these cases the SMARTS staff was informed that the experiences of the teachers in the SMARTS Lesson Study training and implementation were an important component of the decision to implement Lesson Study on a wider scale.

We believe, and give evidence in this report, that SMARTS had a positive impact on student achievement in mathematics and science. There are a number of measures of such impact, including teacher- and project-constructed instruments (such as Distributed Practice questions and quizzes and Quick Write assignments), discussed in the Project Performance section. We have an initial analysis of ITBS math and science data for students from SMARTS schools and continue to analyze it to uncover information and/or trends in the data. We note that the average math and science ITBS scores in classrooms with a SMARTS teacher were consistently higher than the average ITBS scores in classrooms where the teacher did not participate in the SMARTS program. This was also true for the baseline year 2004-2005, before SMARTS started. On the other hand, the changes in mean ITBS math and science scores from year to year for the SMARTS teachers' students and the non-SMARTS teachers' students did not reveal more improvement over time for SMARTS students compared with non-SMARTS teachers' students. Hence these data on changes to means do not support the conclusion that SMARTS had a measurable positive impact on ITBS scores. The question we are now investigating is whether finer disaggregations or analyses can identify SMARTS impact more clearly. We are doing these additional analyses now with the assistance of the University of Iowa Educational Measurement and Statistics program and a graduate research assistant from the College of Education.

In the sections below we give more detailed summaries of results from several SMARTS evaluation components.

SUMMARY OF RESULTS on ITBS

We collected science and mathematics ITBS student scores, with no student names attached to any scores, for all the students at schools participating in SMARTS for each of the four academic years 2004-2005, 2005-2006, 2006-2007 and 2007-2008. The ITBS scores collected were those for the Science, Math Concepts and Estimation, Math Problem Solving and Data Interpretation, and Math Computation tests in the ITBS instruments. Those scores were disaggregated only by teacher name and grade. With these data we were able to compare the differences in mean student scores on these tests for SMARTS vs. non-SMARTS teachers during the 2005-2008 academic years.

The ITBS analysis by the University of Iowa Educational Measurement and Statistics program shows that average ITBS scores in classrooms with a SMARTS teacher were consistently higher than the average ITBS scores in classrooms where the teacher did not participate in the SMARTS program. The direction of the difference was consistent for all 12 of the comparisons; however, in only 8 of the comparisons was the difference large enough to be significant at alpha = .05.

Mean differences (and p-values) between ITBS scores in classrooms taught by SMARTS teachers and those taught by teachers not participating in the SMARTS program:

School Year   Science       Math Concepts   Math Problem Solving   Computation
2004-05       4.74 (.10)    6.51 (.01)      7.38 (.003)            2.85 (.02)
2005-06       6.97 (.03)    3.88 (.12)      5.39 (.05)             3.67 (.08)
2006-07       4.97 (.14)    6.36 (.03)      6.25 (.14)             5.59 (.03)
2007-08       3.44 (.28)    5.25 (.05)      5.51 (.09)             3.24 (.07)

We note that there are higher ITBS math and science mean scores for SMARTS teachers' students compared with non-SMARTS teachers' students, even in the baseline year 2004-2005, before SMARTS started. Examination of the changes in mean ITBS math and science scores from year to year for the SMARTS teachers' students and the non-SMARTS teachers' students did not reveal more improvement over time for SMARTS students compared with non-SMARTS teachers' students. Hence these data on changes to means do not support the conclusion that SMARTS had a measurable positive impact on ITBS scores. We include additional discussion of these results in the next sections. We include a copy of the ITBS analysis by the University of Iowa Educational Measurement and Statistics program in the Appendices.

We will continue to analyze the ITBS data at finer levels by employing additional disaggregations in order to determine if subcategories within the data might reveal any additional significant differences between SMARTS and non-SMARTS teachers' students' achievement on these instruments. We are currently working with a graduate assistant in the College of Education to continue the analyses.

SUMMARY OF RESULTS on Quick Write, Distributed Practice, Lesson Study and Zoomerang

Very concisely, the project staff analyses of these data by and large led the PIs to the conclusion that in many, but not all, of the participants there was growth and improvement in the areas of science and mathematics inquiry-based teaching, as well as in teachers' abilities to observe student work and make effective use of student in-class conversations and writings about science and mathematics topics. Most teachers indicated at several points during the project that they thought DP was a highly effective methodology for raising students' level of involvement in doing and communicating mathematics. And most teachers indicated that they thought the Lesson Study process was a highly effective methodology for improving their own content understanding, for improving their teaching of content and for enhancing their own 'learning communities'. Teachers also reported increased confidence and enthusiasm for the teaching of science and mathematics as a result of their participation in SMARTS.

SUMMARY OF RESULTS on SEC

The SEC instruments in science and mathematics were administered pre- and post-project: in summer 2005 (Science N = 38, Math N = 37) and in fall 2008 (Science N = 15, Math N = 23). Teachers chose math or science depending on their teaching emphasis, and were allowed to complete both instruments if appropriate. There were a total of 38 participants who completed the mathematics SEC in either 2005 or 2008. Of those 38, 22 participants had taken the mathematics SEC in both years. There were a total of 39 who completed the science SEC in either 2005 or 2008. Of those 39, 14 participants had taken the SEC in both years.

Our analyses of the 2005 data (included in the SMARTS Year 1 Annual Progress Report) described baseline data along with our expectations and hopes for changes by the end of the project. Our hope and expectation was that such training would lead to SMARTS teachers' students spending a higher percentage of total time in science and mathematics on inquiry-based, student-centered classroom practice (making connections, analyzing, conjecturing, demonstrating understanding, proving, and problem solving), and less on practicing procedures, recalling memorized facts and making measurements.

In our analyses we compared the overall responses of all respondents in mathematics and in science. Although this may present a problem, in that it compares different overall groups (which include some of the same respondents), we felt that it does give some indication of the disposition toward science and mathematics of the entire SMARTS participant pool. The 2008 SEC results indicate that these expectations and hopes were in many cases met, judging from the teachers' responses in 2008. See the Project Performance section of this report for data supporting these contentions.

We also include the SEC analyses performed by the University of Iowa Center for Evaluation and Assessment. Those analyses compare responses only of SMARTS participants who took both the 2005 and the 2008 surveys. It also includes more detail on the teachers' responses to questions related to emphases within science and mathematics in their professional development experiences.

We believe that parts of the CEA analyses support many of the contentions made in our analyses. For example, in the CEA analyses one finds that teachers reported a marked increase from 2005 to 2008 in the percent of time students spent in science classes on the activities “Design investigation to solve a scientific question” and “Change variable in an experiment, test hypothesis”, and a marked decrease from 2005 to 2008 in the percent of time spent in science classes on the activity “Follow step-by-step directions”. In the CEA analyses one also finds that teachers reported a marked increase from 2005 to 2008 in the percent of time students spent in mathematics classes on the activities “Explain their reasoning in solving a problem by using several sentences orally or in writing”, “Complete or conduct proofs or demonstrations of mathematical reasoning” and “Display and analyze data”. These changes are consistent with the SMARTS inquiry- and exploration-based, technology-rich design for teacher content and pedagogy training. We will point out additional results in the next sections.

SUMMARY OF RESULTS on SMARTS Impact Outside of SMARTS

We include additional details on these results in the Project Performance Section III. We note that the Lesson Study training in SMARTS has played an important and positive role in the implementation of Lesson Study as the chosen PD model district-wide in the College Community School District, as well as in its adoption as the school-wide PD model at Penn Elementary and Van Allen Elementary in the Iowa City Community School District. In all these examples SMARTS staff have been assisting in the implementation.


Another setting in which SMARTS has had impact beyond the SMARTS participants is the use of SMARTS information to inform University of Iowa mathematics classes for preservice teachers, and of information from those classes to inform the SMARTS project. This collaborative impact, tying university content and pedagogy expertise to schoolteacher, consultant and administrator expertise, has served as a topic of several talks given by the PIs at mathematics and teaching conferences.


III. Project Performance

We include the following sections below.

Introduction - Project Background
Part 1. PROJECT EVALUATION MATERIALS
Part 2. SEC 2005/2008 RESULTS AND ANALYSES

Subsection 1. SMARTS staff analyses
Subsection 2. CEA analyses

Part 3. ITBS 2004, 2005, 2006, 2007, 2008 RESULTS AND ANALYSES
Part 4. LESSON STUDY IN SMARTS
Part 5. SMARTS IMPACT OUTSIDE OF SMARTS
Part 6. SMARTS CUMULATIVE BUDGET SUMMARY

Introduction - Project Background

SMARTS (Science and Math Avenues to Renewed Teachers and Students) was a three-year (2005-2008) Iowa Department of Education and Iowa Board of Regents Mathematics and Science Partnership grant funded through Title II funds in the amount of approximately $450,000. The Principal Investigators were John Dunkhase, Clinical Associate Professor of Teaching and Learning at the University of Iowa, and Walter Seaman, Associate Professor of Mathematics and of Teaching and Learning. The staff included nine Grant Wood Area Education Agency (GWAEA) science and mathematics consultants. The participants included, by the end of the third year, 42 elementary school teachers from 12 GWAEA schools in 6 LEAs. The entire project staff delivered over 100 hours per year of inquiry-based science and mathematics content and pedagogy professional development and lesson study training, with on-site lesson study implementation assistance.

There were approximately 1300 students who were students of teachers participating in SMARTS. There were 42 Grant Wood AEA elementary teachers in SMARTS at the end of the three-year project. The 6 districts and 12 schools involved are listed below.

Districts Elementary Schools

Iowa City Kirkwood, Penn, Van Allen

College Community Prairie Heights, Prairie Ridge

Cedar Rapids Erskine, Johnson, Madison, Truman

Washington Lincoln

Anamosa Strawberry Hill

Vinton-Shellsburg Shellsburg

Participating teachers earned stipends at a $110 per diem rate, 8.5 days per year for three years, and additionally received University of Iowa College of Education graduate credits (3 graduate credit hours per summer and 2 graduate credit hours per academic year) at a reduced rate of 10% of the standard cost (about $35 per credit). Additional funds were spent to pay for teacher substitutes for four days per year of the grant, for each of the teacher participants (with an assumed substitute cost of $115/day). During those four days the planning for and implementation of Lesson Study research lesson cycles were accomplished by the participants.
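As a rough check on these figures, the implied direct cost per teacher can be computed as follows. This is a back-of-envelope sketch only; it assumes (our assumption, not stated above) that the per diem applied to all 8.5 stipend days each year and that substitutes were used for all four release days each year:

    # Back-of-envelope per-teacher direct costs over the three project years.
    # Assumption (ours): the $110 per diem applied to all 8.5 stipend days each
    # year, and the $115/day substitute rate applied to all 4 release days.
    STIPEND_RATE = 110   # dollars per stipend day
    STIPEND_DAYS = 8.5   # stipend days per year
    SUB_RATE = 115       # dollars per substitute day
    SUB_DAYS = 4         # release days per year
    YEARS = 3

    stipends = STIPEND_RATE * STIPEND_DAYS * YEARS  # 2805.0
    subs = SUB_RATE * SUB_DAYS * YEARS              # 1380
    print(f"Stipends per teacher:    ${stipends:,.2f}")
    print(f"Substitutes per teacher: ${subs:,.2f}")
    print(f"Combined per teacher:    ${stipends + subs:,.2f}")  # $4,185.00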

The SMARTS project goals were to: 1. enhance science and math instruction through an inquiry approach, 2. improve student achievement and 3. nurture collaboration among participating teacher teams. Work toward these goals was implemented by a combination of the following activities: Summer Institutes (8.5 days per year) with a science and mathematics inquiry-based content and pedagogy focus and Lesson Study training; ongoing Lesson Study groups (10 at the end of the third year) during the academic years, with on-site implementation assistance from GWAEA science and math consultants, UI science education faculty and UI mathematics faculty; four (4) academic seminars throughout the school year; online discussion groups via a web board; and sharing of learning at an annual Spring Symposium in Years 1 and 2 and at a final SMARTS Lesson Study Conference at the end of Year 3, open to the academic, education and policy communities.

Part 1. Project Evaluation Materials

Project staff evaluated teacher instructional practice and content knowledge by analysis of teacher-submitted student work consisting of Quick Write and Distributed Practice quiz responses and teacher scoring rubrics, science/math notebooks/journal submissions, teacher-constructed SMARTS web board discussions, science and mathematics unit story lines, concept maps, Lesson Study reports and documentation, Survey of Enacted Curriculum (SEC) data, Zoomerang on-line survey data on teaching science and mathematics, and additional teacher and administrator survey instruments and interviews during the project focused on administration support, practice and implementation.

This information was used in constructing staff feedback to teachers about science and mathematics content and effective uses of SMARTS-delivered pedagogies in their classes. Examples of such feedback include staff-constructed DP and QW data templates for questions and scoring rubrics. Additional details were provided in the SMARTS Year 1 APR Appendices A18 and A19. The impact of uses of such templates was detailed in the SMARTS Year 2 APR Appendices A9 and A10. DP and QW information was also included in the Year 3 on-line APR (see part VII Program Evaluation Part G Impact on Teachers).

Feedback from project staff concerning teachers' implementation of the Lesson Study process was provided in Academic Year Seminar meetings and during Lesson Study group meetings in which project staff participated in a consulting capacity. Additional lesson study implementation information was also included in the Year 3 on-line APR (see part VII Program Evaluation Part G Impact on Teachers, included below in Part 5. SMARTS IMPACT OUTSIDE OF SMARTS).

Very concisely, the project staff analyses of these data by and large led to the conclusion that in many, but not all, of the participants there was real growth and improvement in the areas of science and mathematics teaching, as well as in teachers' abilities to observe student work and make effective use of student in-class conversations and writings about science and mathematics topics. Most teachers indicated that they thought DP was a highly effective methodology for raising students' level of involvement in mathematics. And most teachers indicated that they thought the Lesson Study process was a highly effective methodology for improving their own content understanding, for improving their teaching of content and for enhancing their own 'learning communities'. Teachers also reported, on the Zoomerang survey administered in February 2007, increased confidence and enthusiasm for the teaching of science and mathematics as a result of their participation in SMARTS.

One concrete manifestation of the impact of the SMARTS project may be the SMARTS-influenced adoption of Lesson Study as the district professional development model in the College Community School District starting in the last year of SMARTS, as well as the school-wide adoption of the Lesson Study professional development model by two SMARTS schools (Penn Elementary and Van Allen Elementary). We give more detail about these points in Part 5. SMARTS IMPACT OUTSIDE OF SMARTS.

Zoomerang:

An on-line Zoomerang survey was made available to the SMARTS participants in February 2007 as an additional formative assessment instrument for this project. Information about the results was reported in the Year 2 APR Appendix A11. It was completed by 18 of the 42 participants. There were 24 Likert-scale questions on participant attitudes and opinions concerning the depth of the math and science content delivered in the SMARTS Project, principal/administrative support for the SMARTS work, teacher confidence in teaching math and science, goals of SMARTS, etc. Concisely, our interpretation of the Zoomerang results is that good gains were made in the math and science confidence and enthusiasm categories. A good number of participants moved from neutral or lower to very or extremely confident and enthusiastic for both math and science. Below is a copy of the Zoomerang tallied results.

Mathematics (n=20)

Level            Confidence                  Enthusiasm
                 Before %  After %  ∆        Before %  After %  ∆
1 - Not at all       5        0                  0        0
2 - Not very        10        0                  0        0
3 - Neutral         45       20                 40       15
4 - Very            35       70      2X         50       50
5 - Extremely        5       10      2X         10       35     3.5X

Science (n=20)

Level            Confidence                  Enthusiasm
                 Before %  After %  ∆        Before %  After %  ∆
1 - Not at all       5        0                  0        0
2 - Not very        10        5                  5        0
3 - Neutral         50       10                 45        5
4 - Very            25       25                 40       50
5 - Extremely        5       25      5X          5       40      8X
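The ∆ multipliers in these tables appear to be simple After/Before ratios of the reported percentages. That reading is our inference rather than something stated in the survey output, but it reproduces every multiplier shown; a minimal check in Python:

    # Check that each tabled multiplier equals the After% / Before% ratio.
    # Interpreting the Delta column as a ratio is our assumption.
    def fold_change(before_pct, after_pct):
        return after_pct / before_pct

    print(fold_change(35, 70))  # 2.0 -> "2X"   (Math, 4 - Very, Confidence)
    print(fold_change(5, 10))   # 2.0 -> "2X"   (Math, 5 - Extremely, Confidence)
    print(fold_change(10, 35))  # 3.5 -> "3.5X" (Math, 5 - Extremely, Enthusiasm)
    print(fold_change(5, 25))   # 5.0 -> "5X"   (Science, 5 - Extremely, Confidence)
    print(fold_change(5, 40))   # 8.0 -> "8X"   (Science, 5 - Extremely, Enthusiasm)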

We include below SMARTS staff interpretations of the February 2007 Zoomerang survey results. These are quotes from SMARTS staff emails.

"Interesting results on Math and science before and after question (16-23) from SMARTS Project Zoomerang survey."

"Looks like we made good gains in the math and science confidence and enthusiasm categories. It looks like we had a lot of participants move from neutral or lower very or extremely confident and enthusiastic for both math and science (see attached table). Yea! team!"

"The principal involvement is the only thing that doesn't look quite as good - although 70% of participants think we're doing well or very well here but 30% think neutral or or no well. Still not too bad …"neutral or lower very or extremely confident and enthusiastic for both math and science."

Part 2. SEC
Subsection 1. SMARTS staff analyses

The Survey of Enacted Curriculum (SEC) is a product of the Council of Chief State School Officers and the Wisconsin Center for Education Research. A URL for information about the SEC is http://seconline.wceruw.org/SECwebReference.asp. The SMARTS teachers were asked to complete a science and/or mathematics SEC form (including roughly 250 questions and taking about two hours) in the summer of 2005 (Science N = 38, Math N = 37) and again in the fall of 2008 (Science N = 15, Math N = 23). Stipends of $50 for the first and $100 for the second administration were offered.


Analyses of 2005 SEC results were conducted by SMARTS project staff during the Spring and Summer of 2006 and summary analyses were included in the SMARTS Year 1 Annual Progress Report (Appendices A14 and A15, pages 142-204). We include below our analyses of comparisons between 2005 and 2008 results in two Science categories (Cognitive Engagement and Time on Topic) and two Mathematics categories (Cognitive Demand and Time on Topic).

We also include the SEC analysis performed by the University of Iowa Center for Evaluation and Assessment. That analysis compares responses only of SMARTS participants who took both the 2005 and the 2008 surveys. It also includes more detail on the teachers' responses to questions related to emphases within science and mathematics in their professional development experiences.

We believe that parts of this CEA analysis also support some of the contentions made in our analyses, and we will point this out in the next sections. Because many participants in both the science and mathematics groups did not take the SEC at both times, we feel that analyzing all of the responses is one way to give a picture of the participants' implementation of and experiences in school science and mathematics. But we point out again that the groups we consider in our analyses are not the same.

We have chosen in our analyses to focus on these four categories because we feel they give the most important markers for possible impact of the SMARTS participation. In the 2005 analyses appearing in the SMARTS Year 1 Annual Progress Report we included additional information on teachers' responses to SEC questions about total time and percentages of total time spent on a variety of components of Professional Development training throughout the year (e.g. Content Focus, Student Learning Focus, etc.). We would be happy to provide additional information about that data in these other categories if it would be useful.

We plan to continue analyzing the SEC data and to look for additional data supporting the observed changes described here. For example, we intend to combine our analyses of the SEC with information from student science and mathematics ITBS subscores, from teachers' Lesson Study research lesson reports and from additional student artifacts (Distributed Practice quiz information and Quick Write results). With these additional resources we hope to be able to give further scaffolding for the types of impact contentions we make below.

SEC-SCIENCE

SEC SCIENCE COGNITIVE ENGAGEMENT RESULTS

The numbers in the COGNITIVE ENGAGEMENT CATEGORY represent a measure of the percentage of Science instructional time spent on tasks that would develop students' abilities to perform in those cognitive demand categories, within the content areas covered by that teacher.

The chart below with pre and post data on Science Cognitive Engagement results was part of the Survey of Enacted Curriculum data received from WCER on 12-29-08 in an Excel file titled IA 2008 SMARTS K12sciCntVwr-12-29-08.xls, on the page named Marginals.

Note that Content Marginals report proportion of time spent on each cognitive demand category and each content area separately.

Note in the bottom right chart the significant increase (.16 to .26) from 2005 to 2008 in the proportion of time spent on cognitive engagement in the Make Connections/Apply category, and the smaller proportion (.22 to .20) in the Memorize/Recall category. Analyze Information increased slightly from 2005 to 2008 (.16 to .17). These changes are consistent with predictions from 2005 (more details are given below). In the Communicate Understanding category there was a decrease (.22 to .18), which tends to run counter to the hopes and expectations from the 2005 SEC analysis.


We quote from the SMARTS YEAR 1 APR, pages 179-180 about the SCIENCE SEC (note that in the 2008 instrument the phrase "Cognitive Engagement" appears whereas in the 2005 instrument the phrase "Cognitive Demand" is used for the same categories, emphasis added in bold):

"…the lowest percentage instructional time reported was on the cognitive demand categories “Analyze Information” and “Apply concepts/Make connections" ... By the end of the SMARTS project we hope and expect that this percentage will increase in the total proportion of Science instructional time spent.

Some reasons to expect such an increase are that part of the SMARTS project training focuses on the pedagogy of Quick Writes, which may be applied to Science or Mathematics classroom work. In the SMARTS workshop training the participants learn that this pedagogy includes ideas such as justification, reasoning and discussion as part of the rubric for assessing high-quality student responses.

In this connection we also note that during the school year SMARTS teachers are required to turn in to the SMARTS project staff samples of student QW responses along with their assessments of the quality of the student responses. One of the indicators of a high quality response is then the extent to which the students have included justification as part of their answers…"

There are many reasons which could explain the sizable increase in the percent of time which teachers report their students spend on making connections and applying science ideas. The SMARTS project was designed to act as a catalyst for such changes through its focus on science and math inquiry and on active student learning and documentation strategies. On the other hand, these SEC results are self-reported teacher data. It might be the case that some teachers' interpretations of what science cognitive demand category work their students are engaged in at a given time would differ from other teachers' interpretations. Also note that the smaller sample size in the 2008 SEC results may have an impact on the reliability of the 2008 data.

On the other hand, we do consider it to be a desirable and successful outcome that teachers report that their students spend an increased percentage of science class instructional time on making connections and applying what they have learned. As mentioned previously, this increase was one of the goals that informed the design of the project.

It is disappointing that the data does not yield an increase in the percent of science cognitive demand activity on Communicate Understanding. Anecdotes from a significant number of SMARTS teachers about students engaging in conversations about science and mathematics in their classes suggested that this percentage would have increased.

SEC SCIENCE TOPIC COVERAGE RESULTS

The numbers in the TOPIC COVERAGE CATEGORY represent a measure of the percentage of Science instructional time spent on SEC-provided content areas which were covered by that teacher.

The charts below with pre and post data on Science CONTENT COVERAGE results were part of the Survey of Enacted Curriculum data received from WCER on 12-29-08 in an Excel file titled IA 2008 SMARTS K12sciCntVwr-12-29-08.xls, on the page named Marginals.

Note in the bottom right chart the sizable increase (.11 to .27) from 2005 to 2008 in the percentage of time spent on the Nature of Science category, increases in Science and Technology (.02 to .07) and in the Animal Biology category (.03 to .11), a slight increase in the percentage of time on Ecology (.03 to .04), and a smaller percentage of time (.22 to .12) on the Measurement in Science category. There was also a noticeable decrease (.12 to .05) in the percentage of time spent on Earth Systems.

The following is an excerpt from the Year 1 APR Appendix A15 on Science SEC results, page 203 (emphasis added in bold):

"We note that the SMARTS teacher responses indicate that the greatest percentages of the total Science instruction time throughout the year were spent on Measurement in Science …Nature of Science…, Science/Health/Environment and Earth Systems...

The SMARTS staff will watch with interest the content coverage category distributions for any significant changes. Additionally it will be of interest to the SMARTS staff to see if any of the Science topics discussed in the SMARTS workshops or academic year seminars which currently appear with small percentages of instructional content appear with changed percentages of Science content coverage by the end of the project. Some potential Science content areas which appear as part of the SMARTS focus are: Science & Technology (e.g. SMARTS use of GPS for finding locations, making measurements, etc.), and Ecology (e.g. the SMARTS investigations of water quality in Lake MacBride via analysis of the phosphate, nitrate and algae content)."

There are many reasons which could explain the sizable increase in the percent of time which teachers report their students spend on Nature of Science and Science and Technology. The SMARTS project was designed to act as a catalyst for such changes through its focus on science and mathematics inquiry and on active student learning and documentation strategies. On the other hand, these SEC results are self-reported teacher data. It might be the case that some teachers' interpretations of what science time on topic categories their students are engaged in at a given time would differ from other teachers' interpretations. Also note that the smaller sample size in the 2008 SEC results may have an impact on the reliability of the 2008 data.

On the other hand, we believe it is reasonable to consider it a desirable and successful outcome that teachers report that their students spend a lower percentage of science class instructional time on Making Measurements in Science, in the sense that measurements in and of themselves may represent a 'rote' aspect of the scientific inquiry process. Of course the context of the measurement activities is quite important here, and measurements may represent crucial components in some scientific experiments. But measurement without careful attention to scientific hypotheses, methodology and analyses of claims and evidence would represent a limited view of the scientific process.


SEC-MATHEMATICS

SEC MATHEMATICS COGNITIVE DEMAND RESULTS

The numbers in the Cognitive Demand category represent a measure of the percentage of mathematics instructional time spent on tasks that would result in students' abilities to perform those cognitive demand categories in the content areas which were covered by that teacher.

The chart below with pre and post data on Mathematics Cognitive Demand results was part of the Survey of Enacted Curriculum data received from WREC on 12-30-08 in an Excel file titled IA 2008 SMARTS K12-MATH-CntVwr-12-30-08.xls on the page named Marginals.

We note in the bottom right chart that there are sizable increases from 2005 to 2008 in the proportion of time spent on three cognitive demand categories: Solve non-routine problems/Make connections (.12 to .19), Conjecture/Generalize/Prove (.11 to .18) and Demonstrate understanding (.19 to .23). There were smaller increases in the categories Memorize/Recall (.17 to .19) and Perform procedures (.20 to .21).

It is our opinion that the increases observed are consistent with the expectations and hopes discussed in the SMARTS Year 1 APR about the MATHEMATICS SEC (Appendix A14, pages 165-166 and 169-170), quoted below (emphasis added in bold).

" The SMARTS staff will watch with interest for changes that occur in these statistics as the SMARTS project progresses. By the end of the SMARTS project we hope and expect that the distributions of these percentages number will change to show an increase in the total proportion of mathematics instructional time spent on “Demonstration and/or Explanation” and “Student Activities related to Active Learning”. Some reasons to expect such increases are that the SMARTS focus on active learning strategies such as the discussions associated with distributed practice and questioning strategies are important components of the participants’ workshop experiences.

We would also hope for an increase in the total proportion of mathematics instructional time spent on “Student Activities related to Make connections”. The idea of students making connections is one of the components of the science pedagogy called Quick Writes (QW). Those are short student-written reflections on science content. The SMARTS project integrates QW pedagogy into the workshop training. Part of what constitutes high-quality QW responses from students is their writing about applications and justifications of what they learned in science. In student QWs we hope to see applications and justifications, and evidence of students making connections with mathematics in their science lessons and vice versa… "

(from pages 169-170)

"We also note that the lowest percentage instructional time reported was on the cognitive demand category "Conjecture, Generalize, Prove" ... By the end of the SMARTS project we hope and expect that this percentage will increase in the total proportion of mathematics instructional time spent.

Some reasons to expect such an increase are that part of the SMARTS project training focuses on the mathematics pedagogy of Distributed Practice, as well as the science pedagogy of Quick Writes. In the SMARTS workshop training the participants learn that those pedagogies include ideas such as justification, reasoning and discussion as part of the rubric for assessing high-quality student responses."

As in the Science categories, there are many reasons which could explain the larger increases in the percent of time which teachers report their students spend on Solve non-routine problems/Make connections, Conjecture/Generalize/Prove, etc. The SMARTS project was designed to act as a catalyst for such changes through its focus on science and math inquiry and on active student learning and documentation strategies. On the other hand, these SEC results are self-reported teacher data. It might be the case that some teachers' interpretations of what mathematics cognitive demand category work their students are engaged in at a given time would differ from other teachers' interpretations. Also note that the smaller sample size in the 2008 SEC results may have an impact on the reliability of the 2008 data.

Nevertheless, we do consider the larger increases noted above to be desirable and successful outcomes: teachers report that their students spend an increased percentage of mathematics class instructional time on making connections and applying what they have learned. As mentioned previously, this increase was one of the goals that informed the design of the project. The quote above from pages 169-170 of the SMARTS Year 1 APR cites the implementation of Distributed Practice as a teaching practice which we hoped would lead to students engaging in mathematically rich conversations about solution strategies, justifications (proofs) and generalizations. The percentage increases noted above are at least consistent with these aims.

SEC MATHEMATICS TIME ON TOPIC RESULTS

The numbers in the topic coverage category (Time on Topic) represent a measure of the percentage of mathematics instructional time spent on SEC-provided content areas which were covered by that teacher.

The charts below with pre and post data on mathematics Time on Topic results were part of the Survey of Enacted Curriculum data received from WCER on 12-30-08 in an Excel file titled IA 2008 SMARTS K12-MATH-CntVwr-12-30-08.xls, on the page named Marginals.

We note in the chart that there are small decreases from 2005 to 2008 in the percentage of time spent on Number Sense and Operations, sizable increases in the percentage of time on Geometric Concepts (.10 to .16) and Basic Algebra (.02 to .07), and a slight increase in Data Displays (.06 to .08).


There are many reasons which could explain the sizable increase in the percent of time which teachers report their students spend on Geometric Concepts and Basic Algebra. The SMARTS project was designed to act as a catalyst for such changes through its focus on science and mathematics inquiry and on active student learning and documentation strategies. On the other hand, these SEC results are self-reported teacher data. It might be the case that some teachers' interpretations of what mathematics time on topic categories their students are engaged in at a given time would differ from other teachers' interpretations. Also note that the smaller sample size in the 2008 SEC results may have an impact on the reliability of the 2008 data.


Quote from the SMARTS Year 1 APR (Math SEC analysis Appendix A14 pages 171-172, emphasis added)

"The SMARTS teacher responses result in the following ordering from greatest percentage of mathematics instructional time to least percentage of mathematics instructional time spent on the following content areas:…

Nbr. Sense, properties, relationships
Operations
Measurement
Geometric Concepts
Data analysis, probability, statistics
Algebraic Concepts
Instructional Technology

We note that the SMARTS teacher responses indicate that one of the lower percentages of the total mathematics instruction time throughout the year was spent on the content topic "Data analysis, probability, statistics" .

By the end of the SMARTS project we hope and expect that this percentage will increase. There are a number of reasons why we would expect such an increase. In addition to Data Analysis and Probability being one of the targeted areas of content focus advocated by the National Council of Teachers of Mathematics 2000 publication Principles and Standards for School Mathematics, we note an important part of the SMARTS project focus is on science kits which the teachers use in their classes. Those topics often involve gathering data for analysis. The SMARTS summer workshop training includes data collection (in an environmental setting, ideas from which are applicable to many of the science kits used in teachers' classrooms), and analysis of the resulting data. The workshop also includes teacher study of various mathematical models of data: multiple representations, measures of central tendency, and the impact of confounding effects influencing ideas of causal connections in data analysis. We hope and expect that this SMARTS training in data analysis will result in a greater emphasis on this topic during the school year.

We also note that the SMARTS teacher responses indicate that the lowest percentage of the total mathematics instruction time throughout the year was spent on the content topic "Instructional Technology" (2%). By the end of the SMARTS project it is possible that we may see an increase in this percentage. The reason we might see such an increase is that the SMARTS workshop training includes uses of several technologies applicable to teachers' classroom work."

We note that although the increases in percentages of time were small, there were increases from 2005 to 2008 in the total mathematics instruction time reported spent on the categories of Data Displays, Statistics, Probability, and Instructional Technology. Those topics were among the ones cited in the Year 1 APR as topics for which we hoped and expected increases to occur. These were desirable outcomes, as these topics are considered important mathematics content background for students and teachers, as documents such as the NCTM Principles and Standards for School Mathematics, the CBMS Mathematics Education for Teachers report and the Iowa Core Curriculum indicate.

We would also like to comment on the observed increases in the reported percentages of time spent on Basic Algebra and Geometric Concepts. Although we cannot claim a causal connection between participants' SMARTS project work and these increases, we want to mention that those topics were part of the mathematics content activities in a number of the SMARTS Academic Year Seminars and Summer Institutes. Additionally, some work on Basic Algebra and on Geometric Concepts was the focus of some of the SMARTS teachers' Lesson Study research lessons on mathematics. We hope, and find it credible, that such work in SMARTS contributed to these increases. Further investigation of these topics will be part of our ongoing analyses of the full set of SMARTS data collected during the project.

Part 2. SEC
Subsection 2. CEA analyses

SEC analyses were also performed by the University of Iowa Center for Evaluation and Assessment. Those analyses compare responses only of SMARTS participants who took both the 2005 and the 2008 surveys. They also include more detail on the teachers' responses to questions related to emphases within science and mathematics in their professional development experiences. The SEC instruments in science and mathematics were administered pre- and post-project: in summer 2005 (Science N = 38, Math N = 37) and in fall 2008 (Science N = 15, Math N = 23). Teachers chose math or science depending on their teaching emphasis, and were allowed to complete both instruments if appropriate. There were a total of 38 participants who completed the mathematics SEC in either 2005 or 2008. Of those 38, 22 participants had taken the mathematics SEC in both years. There were a total of 39 who completed the science SEC in either 2005 or 2008. Of those 39, 14 participants had taken the SEC in both years.

We believe that parts of the CEA analyses support many of the contentions made in our analyses. For example, in the CEA analyses one finds that teachers reported a marked increase from 2005 to 2008 in the percent of time students spent in science classes on the activities “Design investigation to solve a scientific question” and “Change variable in an experiment, test hypothesis”, and a marked decrease from 2005 to 2008 in the percent of time spent in science classes on the activity “Follow step-by-step directions”. In the CEA analyses one also finds that teachers reported a marked increase from 2005 to 2008 in the percent of time students spent in mathematics classes on the activities “Explain their reasoning in solving a problem by using several sentences orally or in writing”, “Complete or conduct proofs or demonstrations of mathematical reasoning” and “Display and analyze data”. These changes are consistent with the SMARTS inquiry- and exploration-based, technology-rich design for teacher content and pedagogy training.

We include the CEA SEC analyses as an appendix to this report. We pull some summary information from that report in this section to provide evidence of the assertions made in the previous paragraph. We believe that the changes cited below are consistent with the SMARTS design for teacher inquiry-based content and pedagogy training in a technology-friendly environment.


Science Examples

In the CEA report see Table 8 for the instructional activities “Change variable in an experiment, test hypothesis” and “Design investigation to solve a scientific question” in the contexts of laboratory activities, investigations or experiments. The scale in use ran from 0 to 4:

Table 8. Activities During Labs and Experiments

Item stem                                              2005 Mean (sd)   2005 Freq 0-4   2008 Mean (sd)   2008 Freq 0-4
Change variable in an experiment, test hypothesis      1.86 (1.03)      1 5 3 5 0       2.79 (0.97)      0 2 2 7 3
Design investigation to solve a scientific question    1.43 (0.94)      2 6 4 2 0       2.00 (0.96)      1 3 5 5 0

Note that in 2005 8 of these 14 teachers (about 57%) reported 0 or 1 (less than 10% of student science lab instructional time) spent on “Design investigation to solve a scientific question” whereas in 2008 only 4 of these 14 teachers (about 29%) reported 0 or 1. In 2005 6 of these 14 teachers (about 43%) reported 0 or 1 (less than 10% of student science lab instructional time) spent on “Change variable in an experiment, test hypothesis” whereas in 2008 only 2 of these 14 teachers (about 14%) reported 0 or 1.
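The percentages quoted in the preceding paragraph can be recovered directly from the frequency rows of Table 8. A minimal sketch of that computation follows; the frequency lists are copied from the table, and the helper name share is ours:

    # Share of the 14 teachers whose response fell on given scale points,
    # computed from the 0-4 frequency rows of Table 8 above.
    def share(freqs, points):
        """freqs[i] = count of teachers answering i; points = scale values to pool."""
        return sum(freqs[p] for p in points) / sum(freqs)

    design_2005 = [2, 6, 4, 2, 0]  # "Design investigation...", 2005
    design_2008 = [1, 3, 5, 5, 0]  # same item, 2008
    change_2005 = [1, 5, 3, 5, 0]  # "Change variable...", 2005
    change_2008 = [0, 2, 2, 7, 3]  # same item, 2008

    print(round(share(design_2005, [0, 1]), 2))  # 0.57 -> about 57%
    print(round(share(design_2008, [0, 1]), 2))  # 0.29 -> about 29%
    print(round(share(change_2005, [0, 1]), 2))  # 0.43 -> about 43%
    print(round(share(change_2008, [0, 1]), 2))  # 0.14 -> about 14%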

Mathematics Examples

In the CEA report see Table 20 for the instructional activities “Explain their reasoning in solving a problem by using several sentences orally or in writing” and “Complete or conduct proofs or demonstrations of mathematical reasoning”. The scale in use ran from 0 to 4:

Table 20. Instructional Activities in Mathematics, Questions 37-44

Item stem                                                       2005 Mean (sd)   2005 Freq 0-4   2008 Mean (sd)   2008 Freq 0-4
Explain their reasoning in solving a problem by using
several sentences orally or in writing                          2.42 (1.30)      1 5 3 5 5       2.79 (0.98)      0 2 5 7 5
Complete or conduct proofs or demonstrations of
mathematical reasoning                                          1.16 (1.07)      7 4 6 2 0       1.58 (1.07)      4 4 7 4 0


Note that in 2005 6 of these 19 teachers (about 32%) reported 0 or 1 (less than 10% of student mathematics instructional time) spent on “Explain their reasoning in solving a problem by using several sentences orally or in writing” whereas in 2008 only 2 of these 19 teachers (about 11%) reported 0 or 1. In 2005 11 of these 19 teachers (about 58%) reported 0 or 1 (less than 10% of student mathematics instructional time) spent on “Complete or conduct proofs or demonstrations of mathematical reasoning” whereas in 2008 only 8 of these 19 teachers (about 42%) reported 0 or 1.

In the CEA report see Table 23 for the instructional activity “Display and analyze data” in the context of educational technology. The scale in use ran from 0 (no use of technology) to 4:

Table 23. Instructional Activities in Mathematics, Questions 58-63

Item stem                   2005 Mean (sd)   2005 Freq 0-4   2008 Mean (sd)   2008 Freq 0-4
Display and analyze data    0.79 (1.03)      10 5 2 2 0      1.42 (1.12)      4 8 2 5 0

Note that in 2005 only 9 of these 19 teachers (about 47%) reported at least 1 (at least Little use of technology) in this context for data display and analysis, whereas in 2008 15 of these 19 teachers (about 79%) reported at least 1.

Part 3. ITBS

We collected science and mathematics ITBS student scores, with no student names attached to any scores, for all the students at schools participating in SMARTS for each of the four academic years 2004-2005, 2005-2006, 2006-2007 and 2007-2008. The ITBS scores collected were those for the four test categories of Science, Math Concepts and Estimation, Math Problem Solving and Data Interpretation, and Math Computation in the ITBS instruments. Those scores were disaggregated only by teacher name and grade. With these data we were able to compare the differences in mean student scores on these tests for SMARTS vs. non-SMARTS teachers during the 2005-2008 academic years.

Average ITBS scores in classrooms with a SMARTS teacher were consistently higher than the average ITBS scores in classrooms where the teacher did not participate in the SMARTS program. The direction of the difference was consistent for all 12 of the comparisons; however, in only 8 of the comparisons was the difference large enough to be significant at alpha = .05.

Mean differences (and p-values) between ITBS scores in classrooms taught by SMARTS teachers and those taught by teachers not participating in the SMARTS program:

School Year   Science       Math Concepts   Math Problem Solving   Computation
2004-05       4.74 (.10)    6.51 (.01)      7.38 (.003)            2.85 (.02)
2005-06       6.97 (.03)    3.88 (.12)      5.39 (.05)             3.67 (.08)
2006-07       4.97 (.14)    6.36 (.03)      6.25 (.14)             5.59 (.03)
2007-08       3.44 (.28)    5.25 (.05)      5.51 (.09)             3.24 (.07)
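The report does not state which statistical test produced the p-values above. As one plausible reading, the sketch below runs a two-sample (Welch's) t-test on classroom mean scores; the classroom means here are synthetic placeholders, not SMARTS data:

    # Illustration only: comparing classroom mean ITBS scores for SMARTS vs.
    # non-SMARTS teachers with a two-sample t-test. The actual analysis method
    # is not specified in the report, and these classroom means are made up.
    from scipy import stats

    smarts_means = [215.0, 221.5, 208.3, 230.1, 224.7, 218.9]
    non_smarts_means = [210.2, 205.8, 219.4, 212.6, 207.1, 214.3, 209.9]

    mean_diff = (sum(smarts_means) / len(smarts_means)
                 - sum(non_smarts_means) / len(non_smarts_means))
    t_stat, p_value = stats.ttest_ind(smarts_means, non_smarts_means,
                                      equal_var=False)  # Welch's t-test
    print(f"Mean difference: {mean_diff:.2f}, p = {p_value:.3f}")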

Because this comparison starts in the 2004-05 school year, prior to the SMARTS program, at least some of the differences are due to pre-existing differences between teachers. To assess whether participation in the SMARTS program led to higher average scores on the ITBS, change over time for classrooms taught by SMARTS teachers and classrooms taught by teachers not participating in the SMARTS program was compared. Average scores in classrooms with a SMARTS teacher did not improve over time more than average ITBS scores in classrooms where the teacher did not participate in the SMARTS program.

Change over time in ITBS scores in classrooms taught by SMARTS teachers and those taught by teachers not participating in the SMARTS program

Change           SMARTS        Science   Math Concepts   Math Problem Solving   Computation
Year 2 – Year 1  Yes            4.62      1.21            -0.94                  -0.37
                 No             3.43      5.68             2.48                  -1.43
                 Difference     1.19     -4.47            -3.43                   1.07
                 p-value        0.62      0.14             0.22                   0.53
Year 3 – Year 1  Yes           -2.21      1.74            -2.05                  -0.26
                 No            -1.79      3.04             0.34                  -2.37
                 Difference    -0.42     -1.30            -2.39                   2.11
                 p-value        0.86      0.62             0.41                   0.19
Year 4 – Year 1  Yes            1.14      2.07            -3.29                   0.16
                 No            -0.78      3.33            -1.92                  -1.73
                 Difference     1.91     -1.26            -1.37                   1.88
                 p-value        0.59      0.71             0.66                   0.21
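The “Difference” rows in the table above are simply the SMARTS gain minus the non-SMARTS gain. The short sketch below recomputes the Year 2 – Year 1 row from the published group means; small discrepancies (e.g., -3.42 vs. the published -3.43) reflect rounding of the group means in the source tables.

    # Recompute the Year 2 - Year 1 "Difference" row from the group means above.
    # Agreement with the table is up to rounding of the published group means.
    year2_gains = {
        "Science": (4.62, 3.43),
        "Math Concepts": (1.21, 5.68),
        "Math Problem Solving": (-0.94, 2.48),
        "Computation": (-0.37, -1.43),
    }
    for test, (smarts, non_smarts) in year2_gains.items():
        print(f"{test}: difference = {smarts - non_smarts:+.2f}")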

We note that mean ITBS math and science scores of SMARTS teachers' students were higher than those of non-SMARTS teachers' students even in the baseline year 2004-2005, before SMARTS started. That fact may be related to those teachers' desire and/or motivation to be part of the SMARTS project in the first place. That the same pattern recurs during the years 2005-2008, in which SMARTS was active, is therefore not surprising. A more nuanced measurement of the impact of SMARTS is to examine the year-to-year changes in mean ITBS math and science scores for the SMARTS teachers' students versus the non-SMARTS teachers' students.

Here the data did not support the conclusion that SMARTS had a positive effect, because mean scores in classrooms with a SMARTS teacher did not improve over time more than mean ITBS scores in classrooms where the teacher did not participate in the SMARTS program. To conclude that SMARTS had a positive effect on ITBS scores, one would need to show more improvement in scores in the SMARTS classrooms than in the non-SMARTS classrooms, and the global data considered did not show this.

Loosely speaking, we might say that even though the SMARTS students’ ITBS means were in fact higher, they just were not 'higher enough' to allow one to conclude that SMARTS 'boosted' the increase.

We will continue to analyze the ITBS data at finer levels by employing additional disaggregations in order to determine if subcategories within the data might reveal any additional significant differences between SMARTS and non-SMARTS teachers' students' achievement on these instruments. We are currently working with a graduate assistant in the College of Education to continue the analyses.

Part 4. LESSON STUDY IN SMARTS

To discuss the Lesson Study (LS) component of SMARTS, we pull information from several of the APRs and additional resources. An essential component of the SMARTS participant professional development and classroom and school implementation was the Lesson Study format for teacher-centered professional development. See the Year 1 Annual Progress Report for additional information concerning Lesson Study implementation by SMARTS participants.

In year 2 of SMARTS, some school districts initiated a program of integrating lesson study or lesson study-like experiences for their teachers who were not SMARTS participants. These actions were in part the result of principals' meetings with SMARTS teachers and staff in year 1 and of principals' participation in the LS cycles (such as the research lesson).

Video capture of the LS cycle by the SMARTS participant LS team from Prairie Ridge Elementary (Cedar Rapids, IA) has been used in that school district to show other schools what the LS process might look like. Clips from these video captures, and video from other SMARTS LS team cycles, have been shown at meetings of the entire SMARTS group and at the Year 2 principals' meeting.

During the project, there were examples of non-SMARTS participants, both teachers and content specialists, who asked to join SMARTS LS teams for various components of the LS process: planning, research lesson teaching or observation, debriefing, the reteach or reteach observation, and the process documents (LS report).

For non-SMARTS teachers wishing to participate in the LS processes, the staff thought such participation would be acceptable as long as those teachers were aware that no stipend support or substitute-teacher time would be forthcoming for them.

In several SMARTS schools the school administration considered the LS process important enough that the school itself (not SMARTS) paid for a substitute so that a non-SMARTS elementary teacher could participate in the LS research lesson day activities.

SMARTS LS was very well received by teachers and administrators in participating districts. As one district administrator stated (quoted with permission):

"I am a firm believer of "Lesson Study" for teacher collaboration. … When it is done right, the cultural change is phenomenal. In fact, I think project SMARTS is one of the best professional development models I have ever seen. It really challenges the teachers to think about how to reach all students before, during and after each lesson. We hope you will be able to … help us move the entire system to the next level. "

We include below three quotes from SMARTS participant teachers, taken from the SMARTS Web Board discussion about Lesson Study. We believe these quotes indicate some of the important learning about, and appreciation of, the LS model.

"(Lesson study) is about student learning. What are our goals for the students and how do we better achieve thosegoals. …you don't have to make huge leaps to get at a better lesson. Start with the curriculum, standards and benchmarks, and desired student outcomes; then try to determine what's keeping you from achieving the desired outcomes in a particular topic or lesson, and focus on improving a piece of that. How to do you make one thingclearer and more attainable for the class?"

"We must not overlook the importance of the debriefing. I think it's important that we invite other teachers/ administrators to observe some of our planning sessions, teaching, and debriefing. This will be a good way to get people "on board." I like the idea of starting with DP as a way of getting people started. The lesson study model certainly fits in well with our district's goal of collaboration this year."

"This idea of less is more applies to our revision of our lesson study lesson that we just completed last week for the unit floating and sinking in science. After revising the original lesson to include more student interaction, the second time I taught it seemed to fly by and that we needed more time…a member of our SMARTS planning team, suggested we split the lesson into two days (to test objects to see if they floated in salt water). Such a simple solution to our time crunch! I can't wait to try it."

We can say generally that for many, or even most, of the teachers involved, we saw evidence of real growth in their ability to work on understanding and delivering math and science content in their classrooms. Many of the teachers were initially fearful of participating in the lesson study process, which includes significant collaborative planning as well as outside observers watching the classroom lesson delivery. Similar hesitancy was expressed by some teachers about implementing Distributed Practice quizzes, which allowed for a wide variety of math problem types and study discussions.

By the end of the grant there were many teachers who came to value the collaborative lesson study process and the distributed practice pedagogy. In fact, some viewed the collaboration process as a setting in which they could work comfortably on learning new math and science content in a positive learning environment.

We also heard from several teachers that their classroom observation skills were improved by the lesson study research lesson implementation protocols. Both as teachers of the research lessons and as research lesson observers, a good number of the teachers indicated that those experiences contributed to an enhanced sensitivity to student thinking and learning in their classrooms.

As a culminating event for SMARTS highlighting the LS work, we held a SMARTS Lesson Study conference on April 22, 2008, at the Brown Deer Country Club in Coralville, Iowa. This culminating SMARTS event drew about 150 participants: SMARTS and non-SMARTS teachers, school administrators, AEA consultants, and university faculty. The agenda included background on the SMARTS implementation of LS, round-table discussions on SMARTS teacher benefits from LS, LS team poster presentations on student learning, and an on-site research lesson presentation/observation by a SMARTS LS team and a class of fourth graders, with debriefing discussions.

We note one more point related to this conference. A number of GWAEA schools which were not part of SMARTS sent a principal and several teachers to the conference. In one such school, one of the conference-attending teachers made the following comment about lesson study to the entire teaching staff of that school, based on his observations at the conference: "The teachers who are doing it love it."

We feel that this positive view of the Lesson Study process as implemented in SMARTS conveys the ownership that SMARTS teachers took of the lesson study model. See the next section on SMARTS impact outside of SMARTS for additional evidence of the LS work in this project.

One of the PIs (Walter Seaman) participated in several conferences in which he gave talks about the SMARTS project, including the LS implementation. Those talks have generated follow-up discussions on the MSP program and the SMARTS project in particular.

Part 5. SMARTS IMPACT BEYOND SMARTS PARTICIPANTS

SMARTS impact beyond the SMARTS participants has occurred in several settings. We first discuss the Lesson Study model used in SMARTS and its impact. Evidence of this impact was discussed in the Year 3 APR (on-line) submitted to the USDE June 30, 2008. We quote from that report below.

"VII. Program Evaluation

I. Other Impacts.

One local school district, College Community, which is home to two schools (Prairie Ridge and Prairie Heights), decided to implement lesson study as a district-wide professional development strategy (K-12). We know from discussions with several CCSD administrators that the decision to do this was significantly influenced (positively) by the work teachers did in the SMARTS grant.

Additionally, we know from discussions with teachers, school administrators and university faculty at the SMARTS Lesson Study conference April 22, 2008, that there is a growing interest in this process on a wider scale. SMARTS staff members have been approached after the conference with requests for help in bringing lesson study resources, information and expertise to those schools."

We note also the Year 3 APR's documentation of SMARTS participants' uses of SMARTS-related experiences in non-SMARTS settings. We quote evidence from that APR below.

"VII. Program Evaluation

G. Impact on Teachers


In one district, College Community, which is home to two schools (Prairie Ridge and Prairie Heights), some of the lesson study teams became quite actively involved in documenting their lesson study processes. They videotaped various parts of the lesson study cycles and put together a short 'documentary' of their process. The resulting video was used to illustrate the implementation of the lesson study process at meetings with other schools."

We note that during the Fall 2008/Spring 2009 school year (after SMARTS had ended), members of the SMARTS staff were contacted by two SMARTS schools, Van Allen Elementary and Penn Elementary, for assistance with their plans to implement the Lesson Study model as a whole-school PD model. SMARTS has assisted those schools by acting as a resource for information and materials. For example, the SMARTS project purchased a set of lesson study DVDs to be donated to each of the two schools. Information from a number of those DVDs was used as part of the training during the SMARTS project activities in 2005-2008 and, it is hoped, will serve to introduce this model on a school-wide basis.

Another setting in which SMARTS has had impact beyond the SMARTS participants is the use of SMARTS content, pedagogy, and participant feedback to inform University of Iowa mathematics classes for pre-service teachers, and, conversely, the use of information from those classes to inform elements of the SMARTS project.

Examples of these phenomena include the following (we would be happy to provide additional documentation of these).

Uses in University of Iowa mathematics classes for pre-service teachers of documented elementary school student errors and error patterns in arithmetic computations. The error patterns served as a basis for the pre-service teachers to analyze the mathematical and learning difficulties the patterns revealed. Documentation of such error patterns was first encountered in one of the PIs' work in SMARTS.

Uses in University of Iowa mathematics classes for pre-service teachers of SMARTS Distributed Practice problems as the basis of assignments on mathematics and on elementary school student responses. For example, a DP problem was shown to a class of pre-service teachers, along with instructions to solve the problem themselves, perhaps in several ways, and to give an example of what they thought a good elementary school student solution to that problem would consist of.

Uses in University of Iowa mathematics classes for pre-service teachers of reading resources used in SMARTS. For example, a teaching vignette on subtraction algorithms was taken from the NRC publication How Students Learn: Mathematics in the Classroom (copyright 2005 by the National Academy of Sciences). This vignette was originally posted on the SMARTS web board (during Year 2 of the project) as a discussion topic starter. Along with the vignette, SMARTS staff posted questions about the mathematics and pedagogy involved for the SMARTS participants to respond to. One of those questions asked the SMARTS teachers to suggest questions about this vignette which might be posed to pre-service teachers to help them understand what is important in conducting a mathematics lesson with elementary school students.

SMARTS teachers' responses were used to pose questions to the pre-service teachers, and the resulting pre-service teacher responses were tallied and shared with those pre-service teachers as well as with the SMARTS teachers. The pre-service teachers were also asked to respond to questions about the mathematics and pedagogy involved in the vignette. Another example of SMARTS reading resources serving pre-service teachers was an article about the use of writing in mathematics classrooms as a catalyst to better mathematics learning. The article was originally examined and discussed in the SMARTS setting; later, this same article was used as an extra-credit assignment for pre-service teachers (they had to answer several questions about the mathematics and pedagogy involved in the article).

Uses in University of Iowa mathematics classes for pre-service teachers of experiences from elementary school mathematics classroom observations by one of the PIs (Walter Seaman). The ability to cite observations about 'actual' elementary school student classroom work was a source of rich information both for the pre-service teachers and for the instructor of their class.

Finally, we include a note about another MSP project on which one of the PIs (Walter Seaman) has been working since the SMARTS grant ended: the EMPOWERR (Elementary Mathematics Partnership Opening Windows to Excellence, Rigor and Relevance) Project. The entire mathematics-teaching staffs of the SMARTS schools Kirkwood Elementary and Strawberry Hill Elementary, as well as that of the non-SMARTS school English Valleys Elementary, are participating in this mathematics content and lesson study project. Teachers who participated in SMARTS were invited to participate in EMPOWERR activities even if their school was not one of the EMPOWERR schools; those teachers would not receive a stipend but were eligible to receive UI course credit at a discounted rate. Two teachers from the SMARTS school Shellsburg Elementary have been fully participating in EMPOWERR activities, including work as participants in the summer 2008 EMPOWERR workshop and the October 29, 2008 EMPOWERR Academic Year Seminar. We believe that their participation reflects a positive view they formed of their experiences in SMARTS. That these non-EMPOWERR teachers would fully participate indicates that they were willing to use their personal time to continue studying mathematics and lesson study in the new grant.

Part 6. SMARTS CUMULATIVE BUDGET SUMMARY

We include below summary information on how the total budgeted SMARTS MSP funds were used. We would be happy to provide additional details if that would be useful. The day-to-day budgetary and fiscal administrative business of the SMARTS grant was handled by the Science Education program of The University of Iowa College of Education.


Description of Budgeted Items                                     Budgeted    Expense Totals   Balance
Personnel: Total Salaries and Fringe                              $96,398     $106,748         ($10,350)
Consultants/Presenters not included above                         $62,925     $77,917          ($14,992)
Participant Support: Stipends                                     $165,240    $130,919         $34,321
Participant Support: Room and Board                               $9,153      $11,229          ($2,076)
Supplies*                                                         $11,150     $32,222          ($21,072)
Travel - Administrative                                           $0          $6,050           ($6,050)
Other:
  A. Project Assessment - Center for Evaluation and Assessment    $14,400     $2,400           $12,000
  B. Survey of Enacted Curriculum (pre and post)                  $3,240      $1,252           $1,988
  C. Other - Sub release pay to school districts                  $46,575     $33,188          $13,387
Total Direct Costs                                                $409,081    $401,925         $7,156
Indirect Costs @ 10% of Total Direct Costs                        $40,908     $40,193          $715
Total Costs                                                       $449,989    $442,118         $7,871

*Supplies includes workshop supplies, curriculum materials, photocopying, printing, lab supplies for workshops, books, equipment rental, office supplies, and the final Lesson Study Conference. Curriculum materials going to teachers and school districts ($11,641.64) are itemized and described below.
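As a simple arithmetic check on the table above, each Balance entry equals the Budgeted amount minus the Expense Totals (balances shown in parentheses are negative). A brief sketch over three of the rows:

    # Verify Balance = Budgeted - Expense Totals for selected budget rows.
    rows = {
        "Salaries and Fringe": (96_398, 106_748),
        "Stipends": (165_240, 130_919),
        "Total Direct Costs": (409_081, 401_925),
    }
    for item, (budgeted, spent) in rows.items():
        balance = budgeted - spent
        print(item, balance)  # -10350, 34321, and 7156, matching the table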

*Curriculum Materials

Vendor                                    Amount
Sage Publications                         $4,798
Greenwood Heineman                        $700
Mills College                             $700
NCREL                                     $1,876
NSTA                                      $848
ICCSD (reimbursement for books, etc.)     $1,075
Amazon                                    $816
HMCO                                      $828
Total                                     $11,642

Descriptions of materials purchased: Leading Lesson Study; Outdoor Inquiries; The Teaching Gap; Geometry to Go; Math to Go; Lesson Study Videos; Teacher to Teacher: Reshaping Instruction through Lesson Study; Inquiry and the Science Education Standards; and Lesson Study books and DVDs (such as How Many Seats?, Can You Find the Area?, and The Secret of Magnets).

IV. Supplemental Information

We outline below some successes and challenges encountered in the SMARTS Project.


Successes

1. The Lesson Study model was very well received. In one SMARTS district, LS has been adopted district-wide as the PD model (across subjects and grades). The decision to do this was positively influenced by the LS experiences of SMARTS teachers in that district and their discussions about this process with their colleagues and administrators (this was confirmed by district administrators). Two SMARTS schools in another district are sharing costs with SMARTS to continue LS training for their entire school staffs (this is post-SMARTS).

2. The Distributed Practice mathematics pedagogy in classes: teachers reported increased richness of students’ mathematical explanations and alternative problem-solving strategies.

3. The large consultant staff afforded a large talent pool to draw on for a wide variety of SMARTS tasks (LS expertise, math and science content assistance, and easier logistics for meetings). The consultants’ familiarity with schools and teachers was key in gaining access to schools and building on existing relationships with teachers and administrators.

Challenges

1. Lesson Study logistics: finding times and locations for planning; team-makeup challenges (different grades, different schools, different curricula, etc.); and ownership of the research lesson work by the entire team. These challenges and possible solutions were the subject of focus-group discussions at the Year 2 summer institute and were reported in the SMARTS Year 2 Interim Report (see Appendix A1, August 2006 Workshop Lesson Study Challenges). A problem cited by a number of teachers was having teachers of different grades on the same lesson study team, and how to choose topics for the research lesson in that setting. Most of the SMARTS lesson study teams had members from different grades (and in some cases even from different schools). We quote below from the discussions at the August 2006 SMARTS meeting on this topic.

“1. Problem: A LS team has teachers for grades 1, 2, 3, 3. How can they do a reteach of a first grade LS research lesson when there is only one first grade teacher on the LS team?
Solutions:
i. Go into another person’s first grade class and teach it. Even if you do not teach first grade, get permission to teach a different first grade teacher’s class and teach the LS research lesson to those students.
ii. Revise the lesson and teach it again to the same (first grade) students.
iii. Ask a different first grade teacher to do the reteach with their class, even if that teacher is not part of the LS group.
iv. “Pull out” parts of the research lesson topics and teach those to one of the other grade levels.

4. Problem: Different grade levels and curricula; ownership if some do not use the curriculum used for the research lesson.
Solutions:
i. Use storylines to understand big ideas.
ii. Go to kit training (all LS team members).

5. Problem: (same as 4) Different grade levels and curricula; ownership if some do not use the curriculum used for the research lesson.
Solutions:
i. Dissect the unit together.
ii. Have a knowledgeable other present to talk about similarities in curricula from a content point of view.
iii. Open topic applicable to multiple grade levels.”

2. Keeping principals and administrators involved. Some were very active, others minimally involved. Some schools had very few teachers in SMARTS, making principal prioritization of SMARTS problematic.


3. Data collection. Access to SMARTS teachers’ students’ ITBS data (with names attached for tracking) necessitated University of Iowa Human Subjects Office (HSO) permission forms with signatures from students and their parents/guardians. The forms were five-page, single-spaced documents, and form return rates were always well below 50%. The HSO process was quite long, complex, and time-consuming. Requests for release of ITBS data to SMARTS staff required permission letters from district administrators to Iowa Testing Programs (ITP); in some cases permission letters were sent to ITP only months later (up to six months).

4. Whole-school involvement. SMARTS served a few teachers from a large number of schools, creating "islands of excellence," but did not systematically impact entire schools. Maintaining strong principal involvement and support is difficult when there are only one to three SMARTS teachers in a school.

The Iowa Professional Development Model's (IPDM) research findings document that substantial improvements in teaching and learning occur only when all site personnel are engaged in a school-wide PD program.

5. The focus on both science and math, and on their integration, sometimes restricted the ability to take advantage of opportunities to go deeper in one subject or the other.

Appendix 1. University of Iowa Educational Measurement and Statistics program ITBS analysis

SMARTS Report
(NAME)
1/26/2009

Average ITBS scores in classrooms with a SMARTS teacher were consistently higher than the average ITBS scores in classrooms where the teacher did not participate in the SMARTS program. The direction of the difference was consistent for all 16 comparisons; however, in only 8 of the comparisons was the difference large enough to be significant at alpha = .05.

Mean Differences (and p-values) between ITBS scores in classrooms taught by SMARTS teachers and those taught by teachers not participating in the SMARTS program

School Year   Science       Math Concepts   Math Problem Solving   Computation
2004-05       4.74 (0.10)   6.51 (0.01)     7.38 (0.003)           2.85 (0.02)
2005-06       6.97 (0.03)   3.88 (0.12)     5.39 (0.05)            3.67 (0.08)
2006-07       4.97 (0.14)   6.36 (0.03)     6.25 (0.14)            5.59 (0.03)
2007-08       3.44 (0.28)   5.25 (0.05)     5.51 (0.09)            3.24 (0.07)


Because this comparison starts in the 2004-05 school year, prior to the SMARTS program, at least some of the differences are due to pre-existing differences between teachers. To assess whether participation in the SMARTS program led to higher average ITBS scores, changes over time for classrooms taught by SMARTS teachers and classrooms taught by teachers not participating in the program were compared. Average scores in classrooms with a SMARTS teacher did not improve over time more than average ITBS scores in classrooms where the teacher did not participate in the SMARTS program.

Change over time in ITBS scores in classrooms taught by SMARTS teachers and those taught by teachers not participating in the SMARTS program

Change           SMARTS        Science   Math Concepts   Math Problem Solving   Computation
Year 2 – Year 1  Yes            4.62      1.21            -0.94                  -0.37
                 No             3.43      5.68             2.48                  -1.43
                 Difference     1.19     -4.47            -3.43                   1.07
                 p-value        0.62      0.14             0.22                   0.53
Year 3 – Year 1  Yes           -2.21      1.74            -2.05                  -0.26
                 No            -1.79      3.04             0.34                  -2.37
                 Difference    -0.42     -1.30            -2.39                   2.11
                 p-value        0.86      0.62             0.41                   0.19
Year 4 – Year 1  Yes            1.14      2.07            -3.29                   0.16
                 No            -0.78      3.33            -1.92                  -1.73
                 Difference     1.91     -1.26            -1.37                   1.88
                 p-value        0.59      0.71             0.66                   0.21

Appendix 2. University of Iowa Center for Evaluation and Assessment SEC Analyses

SMARTS Consultation

Survey of Enacted Curriculum (SEC), Pre-/Post Data Analysis

Prepared for: Walter Seaman & John Dunkhase

The University of Iowa


U.I. Center for Evaluation and Assessment

Evaluation Staff: NAME, Lead Project Evaluation Manager
email
http://www.education.uiowa.edu/cea/

January 2009

INTRODUCTION

This report was prepared for the SMARTS project leaders, for the purpose of writing the final project report. The U.I. Center for Evaluation and Assessment staff consulted with Walter Seaman in November 2008 regarding data analysis for selected questions on the Survey of Enacted Curriculum (SEC). Alissa Minor from SEC provided the raw data in Excel files to the CEA staff in late December 2008.

METHODOLOGY

After receiving the Excel spreadsheets, (name) matched participants using the “Visitor ID” variable, which uniquely identified each participant. There were a total of 38 math participants who completed the SEC in either 2005 or 2008. Of those 38, there were 22 participants who had taken the SEC at both time points. There were a total of 39 science participants who completed the SEC in either 2005 or 2008. Of those 39, 14 participants had taken the SEC at both time points. The data analysis for this report included only the participants who had taken the SEC in 2005 and in 2008.
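A minimal sketch of this matching step is included below, assuming each SEC export is an Excel file containing the “Visitor ID” column; the file names are hypothetical placeholders.

    # Match 2005 and 2008 SEC respondents on the unique "Visitor ID" variable.
    import pandas as pd

    sec_2005 = pd.read_excel("sec_math_2005.xlsx")  # hypothetical file name
    sec_2008 = pd.read_excel("sec_math_2008.xlsx")  # hypothetical file name

    # An inner join keeps only participants who completed the SEC at both
    # time points, mirroring the matching described above.
    matched = sec_2005.merge(sec_2008, on="Visitor ID",
                             suffixes=("_2005", "_2008"))
    print(len(matched), "participants completed the SEC in both 2005 and 2008")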

Due to the small numbers of matched participants, descriptive statistics are reported for each of these groups. It was determined that significance tests were not appropriate, given the low numbers of matched participants. Questions should be directed to (name) at the U.I. Center for Evaluation and Assessment.

RESULTS – SCIENCE

There were 14 science participants who completed the science SEC at both time points. Results are presented for each of the questions identified as important by the SMARTS project leaders. Within the content areas, the organization follows the SEC section headings, starting with professional development and then instructional practices.


There were three questions focused on the frequency of professional development (PD) activities. Each question had two parts, one focused on the frequency of PD and the second focused on the number of PD hours. Respondents were asked to “consider all the professional development activities related to science content or science education that you have participated in since June 1st of last year.” Professional development was defined in varied ways for each of the three questions.

Question 101 focused on workshops and in-service training, which were defined as follows: “In-service training is professional development offered by your school or district to enhance your professional responsibilities and knowledge. Workshops are short-term learning opportunities that can be located in your school or elsewhere.” The item stem for question 101 was:

For the time period referenced above, how often, and for how many total hours, have you participated in workshops or in-service training related to science or science education?

Tables 1 and 2 present the frequencies of participation for 2005 and for 2008 (question 101, parts a and b).

Table 1. Frequency of Participation in Workshops or In-Service Training (101a)

Frequency of response    2005   2008
Never (0)                  1      5
Once (1)                   5      1
Twice (2)                  4      0
3 to 4 times (3)           1      3
5 to 10 times (4)          2      5
> 10 times                 1      0

Table 2. Hours of Participation in Workshops or In-Service Training (101b)

Frequency of response    2005   2008
N/A (0)                    1      5
1 to 6 hours (1)           6      1
7 to 15 hours (2)          4      3
16 to 35 hours (3)         2      2
36 to 60 hours (4)         1      0
61+ hours (5)              0      3

The frequencies for questions 101a and 101b are also represented in the bar graphs in Figures 1 and 2.


Figure 1. Frequency of Response, Question 101a

The mean of item 101a for 2005 was 2.07 (standard deviation = 1.44) and the mean for 2008 was 2.14 (standard deviation = 1.83).
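The reported means and standard deviations can be recovered from the frequency tables by weighting each response code by its count. The sketch below does this for the 2005 column of Table 1; it assumes the “> 10 times” category is coded 5, which is consistent with the reported mean of 2.07.

    # Recover the 2005 mean and sd for item 101a from Table 1's frequency counts.
    import numpy as np

    counts = [1, 5, 4, 1, 2, 1]                  # frequencies for codes 0..5 (n = 14)
    responses = np.repeat(np.arange(6), counts)  # expand counts into response codes
    print(round(responses.mean(), 2))            # 2.07, the reported mean
    print(round(responses.std(ddof=1), 2))       # 1.44, the reported sample sd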

Figure 2. Frequency of Response, Question 101b

The mean of item 101b for 2005 was 1.71 (standard deviation = 1.07) and the mean for 2008 was 2.00 (standard deviation = 1.96). (The frequencies in Table 2 and the reported standard deviation imply a 2005 mean of 1.71; the value appears with transposed digits, as 1.17, in the original report.)

Question 102 focused on summer institutes, which were defined as “longer term professional learning opportunities, for example, of a week or longer in duration.” The item stem for question 102 was:

For the time period referenced above, how often, and for how many total hours, have you participated in summer institutes related to science or science education?


Tables 3 and 4 present the frequencies of participation for 2005 and for 2008 (question 102, parts a and b).

Table 3. Frequency of Participation in Summer Institutes (102a)

Frequency of response    2005   2008
Never (0)                  9      5
Once (1)                   2      7
Twice (2)                  0      0
3 to 4 times (3)           1      1
5 to 10 times (4)          2      1
> 10 times                 0      0

Table 4. Hours of Participation in Summer Institutes (102b)

Frequency of response    2005   2008
N/A (0)                    9      3
1 to 6 hours (1)           1      1
7 to 15 hours (2)          1      2
16 to 35 hours (3)         1      4
36 to 60 hours (4)         2      3
61+ hours (5)              0      1

The frequencies for questions 102a and 102b are also represented in the bar graphs in Figures 3 and 4.

Figure 3. Frequency of Response, Question 102a

The mean of item 102a for 2005 was 0.93 (standard deviation = 1.54) and the mean for 2008 was 1.00 (standard deviation = 1.18).


Figure 4. Frequency of Response, Question 102b

The mean of item 102b for 2005 was 1.00 (standard deviation = 1.57) and the mean for 2008 was 2.43 (standard deviation = 1.65).

Question 103 focused on college courses related to science or science education. The item stem for question 103 was:

For the time period referenced above, how often have you attended college courses related to science or science education and about how many hours did you spend in class?

Tables 5 and 6 present the frequencies of participation for 2005 and for 2008 (question 103, parts a and b).

Table 5. Frequency of Participation in Science-Related College Courses (103a)

Frequency of response    2005   2008
Never (0)                 10      8
Once (1)                   3      4
Twice (2)                  0      1
3 to 4 times (3)           1      1
5 to 10 times (4)          0      0
> 10 times                 0      0

Table 6. Hours of Participation in Science-Related College Courses (103b)

Frequency of response    2005   2008
N/A (0)                   11      9
1 to 6 hours (1)           0      1
7 to 15 hours (2)          1      2
16 to 35 hours (3)         1      2
36 to 60 hours (4)         0      0
61+ hours (5)              1      0

The frequencies for questions 103a and 103b are also represented in the bar graphs in Figures 5 and 6.

Figure 5. Frequency of Response, Question 103a

The mean of item 103a for 2005 was 0.43 (standard deviation = 0.85) and the mean for 2008 was 0.64 (standard deviation = 0.93).

Figure 6. Frequency of Response, Question 103b

The mean of item 103b for 2005 was 0.71 (standard deviation = 1.54) and the mean for 2008 was 0.79 (standard deviation = 1.19).


The next questions of interest were also centered on professional development, including items 128 to 138. The lead statement for each of these items was:

Since June 1st of last year, how much emphasis did your professional development activities in science or science education place on the following topics?

The scale for these items ran from 0 (least emphasis) to 3 (greatest emphasis).

Four participants were deleted due to missing data from the 2005 administration of these 11 items. Table 6 presents the frequencies of responses, means, and standard deviations of these items for 2005 and for 2008.

Table 6. Descriptive Statistics, Questions Focused on Science-Related Emphasis in PD (frequencies are counts of responses 0-3)

State science content standards
    2005: 0.80 (1.03), frequencies 6/0/4/0     2008: 1.60 (0.97), frequencies 1/4/3/2
Alignment of science instruction to curriculum
    2005: 1.60 (1.35), frequencies 3/2/1/4     2008: 1.80 (0.92), frequencies 1/2/5/2
Instructional approaches
    2005: 2.10 (1.20), frequencies 2/0/3/5     2008: 2.30 (0.95), frequencies 1/0/4/5
In-depth study of science or specific concepts in science
    2005: 0.90 (1.10), frequencies 5/2/2/1     2008: 1.80 (0.92), frequencies 1/2/5/2
Study of how children learn particular topics in science
    2005: 1.20 (0.92), frequencies 2/5/2/1     2008: 2.10 (0.99), frequencies 1/1/4/4
Individual differences in student learning
    2005: 0.90 (0.74), frequencies 3/5/2/0     2008: 1.70 (1.06), frequencies 1/4/2/3
Meeting the learning needs of special populations of students
    2005: 0.60 (0.70), frequencies 5/4/1/0     2008: 0.90 (0.57), frequencies 2/7/1/0
Classroom science assessment
    2005: 1.00 (1.05), frequencies 4/3/2/1     2008: 1.60 (1.17), frequencies 2/3/2/3
State or district science assessment
    2005: 0.80 (1.03), frequencies 5/3/1/1     2008: 1.10 (1.20), frequencies 4/3/1/2
Interpretation of assessment data for use in science instruction
    2005: 0.90 (1.10), frequencies 5/2/2/1     2008: 1.20 (1.14), frequencies 3/4/1/2
Technology to support student learning in science
    2005: 0.70 (0.67), frequencies 4/5/1/0     2008: 0.90 (0.88), frequencies 3/6/0/1

Figure 6 displays the 2005 and 2008 means for questions 128 through 138.

Figure 6. Means for Questions 128-138, 2005 and 2008


Instructional Activities in Science

The results in this section focus on the reported instructional activities in science, including the type and frequency of use of various pedagogies and student activities. Descriptive statistics are reported for each year of SEC administration.

Questions 25 to 36 focused on the types of activities students were engaged in during science instruction. The directions indicated that activities are not mutually exclusive, so percentages may exceed 100%. The zero-to-four scale represented increasing percentages of instructional time, from 0 (none) up through 4 (the largest share of time).

The question for each item stem was:

How much of the science instructional time in the target class do students use to engage in the following tasks?

Table 7 includes the frequencies, means, and standard deviations for questions 25 to 36.

Table 7. Instructional Activities in Science, Questions 25-36 (frequencies are counts of responses 0-4)

Listen to the teacher explain something about science to the class as a whole
    2005: 2.21 (0.70), frequencies 0/2/7/5/0     2008: 2.29 (0.91), frequencies 0/2/8/2/2
Read about science in books, magazines, articles
    2005: 1.57 (0.76), frequencies 0/8/4/2/0     2008: 2.00 (0.88), frequencies 0/4/7/2/1
Work individually on science assignments
    2005: 1.79 (0.89), frequencies 0/7/3/4/0     2008: 1.43 (0.94), frequencies 2/6/4/2/0
Write about science in a report or paper
    2005: 1.14 (0.86), frequencies 4/4/6/0/0     2008: 1.57 (0.85), frequencies 0/8/5/0/1
Do a laboratory activity, investigation, or experiment
    2005: 3.57 (0.51), frequencies 0/0/0/6/8     2008: 3.14 (1.10), frequencies 0/2/1/4/7
Watch teacher demonstrate a scientific phenomenon
    2005: 1.36 (0.63), frequencies 0/10/3/1/0    2008: 1.14 (1.03), frequencies 3/8/2/0/1
Collect data (other than laboratory activities)
    2005: 1.71 (0.83), frequencies 1/4/7/2/0     2008: 1.93 (1.33), frequencies 3/2/3/5/1
Work in pairs or small groups (other than labs)
    2005: 2.79 (1.12), frequencies 0/2/4/3/5     2008: 2.71 (1.44), frequencies 2/1/1/5/5
Do a science activity outside the classroom or lab
    2005: 1.07 (0.73), frequencies 3/7/4/0/0     2008: 1.14 (1.03), frequencies 4/6/2/2/0
Use computers, calculators, other technology for science
    2005: 0.93 (0.73), frequencies 4/7/3/0/0     2008: 0.79 (0.58), frequencies 4/9/1/0/0
Maintain and reflect on a science portfolio of their own science work
    2005: 2.14 (1.29), frequencies 2/1/7/1/3     2008: 2.29 (1.38), frequencies 2/2/3/4/3
Take a quiz or test
    2005: 1.57 (0.85), frequencies 0/8/5/0/1     2008: 1.29 (0.61), frequencies 0/11/2/1/0

Figure 7 displays the 2005 and 2008 means for questions 25 through 36.

Figure 7. Means for Questions 25-36, 2005 and 2008


The next set of questions, 37-45, focused on the types of activities that occur during labs and experiments in the science classroom. The same zero-to-four scale that was used in the prior set of questions is used here.

The question for each item stem was:

When students in the target class are engaged in laboratory activities, investigations, or experiments as part of science instruction, how much of that time do they:

Table 8 includes the means, standard deviations (sd), and frequencies of responses for these items.

Table 8. Activities During Labs and Experiments, Questions 37-45 (frequencies are counts of responses 0-4)

Make educated guesses, predictions, or hypotheses
    2005: 3.00 (0.96), frequencies 0/1/3/5/5     2008: 2.71 (0.83), frequencies 0/0/7/4/3
Follow step-by-step directions
    2005: 2.57 (0.94), frequencies 0/2/4/6/2     2008: 1.71 (0.83), frequencies 1/4/7/2/0
Use science equipment or measuring tools
    2005: 3.29 (0.73), frequencies 0/0/2/6/6     2008: 3.00 (0.68), frequencies 0/0/3/8/3
Collect data
    2005: 3.29 (0.73), frequencies 0/0/2/6/6     2008: 3.36 (0.63), frequencies 0/0/1/7/6
Change variable in an experiment, test hypothesis
    2005: 1.86 (1.03), frequencies 1/5/3/5/0     2008: 2.79 (0.97), frequencies 0/2/2/7/3
Organize and display information in tables, graphs
    2005: 2.07 (1.14), frequencies 2/2/3/7/0     2008: 2.29 (1.14), frequencies 0/4/5/2/3
Analyze and interpret science data
    2005: 2.57 (0.94), frequencies 0/2/4/6/2     2008: 2.71 (0.99), frequencies 0/1/6/3/4
Design investigation to solve a scientific question
    2005: 1.43 (0.94), frequencies 2/6/4/2/0     2008: 2.00 (0.96), frequencies 1/3/5/5/0
Make observations/classifications
    2005: 3.29 (0.73), frequencies 0/0/2/6/6     2008: 3.21 (0.80), frequencies 0/0/3/5/6

Figure 8 displays the 2005 and 2008 means for questions 37 through 45.


Figure 8. Means for Questions 37-45, 2005 and 2008

Questions 46 through 51 focused on the amount of instructional time in small groups or pairs, using the same zero-to-four scale.

The question for each item stem was:

When students in the target class work in pairs or small groups as part of science instruction (other than in the science laboratory), how much of that time do they:

Table 9 includes the means, standard deviations (sd), and frequencies of responses for these items.

Table 9. Amount of Science Instruction in Small Groups, Questions 46-51 (frequencies are counts of responses 0-4)

Talk about ways to solve science problems (e.g., design an experiment)
    2005: 2.64 (0.74), frequencies 0/1/4/8/1     2008: 2.93 (0.62), frequencies 0/0/3/9/2
Complete written assignments from the textbook or workbook
    2005: 0.93 (1.27), frequencies 7/4/1/1/1     2008: 0.79 (0.89), frequencies 6/6/1/1/0
Write results/present from a lab activity, experiment, or a research project
    2005: 2.14 (1.23), frequencies 1/4/3/4/2     2008: 1.71 (1.07), frequencies 2/4/4/4/0
Work on an assignment, report, or project over an extended period of time
    2005: 1.21 (1.19), frequencies 5/4/2/3/0     2008: 1.64 (1.01), frequencies 1/6/5/1/1
Work on a writing project or entries for portfolios by seeking peer comments to improve work
    2005: 0.43 (0.85), frequencies 11/0/3/0/0    2008: 1.00 (1.04), frequencies 5/6/1/2/0
Review assignments or prepare for a quiz or test
    2005: 0.93 (1.07), frequencies 5/7/1/0/1     2008: 1.07 (0.83), frequencies 4/5/5/0/0

Figure 9 displays the 2005 and 2008 means for questions 46 through 51.

Figure 9. Means for Questions 46-51, 2005 and 2008

Questions 52 through 56 focused on the amount of time spent collecting science data or information, using the same zero-to-four scale.

The question for each item stem was:

When students in the target class collect data or information about science from books, magazines, computers, or other sources (other than laboratory activities), how much of that time do they:

Table 10 includes the means, standard deviations (sd), and frequencies of responses for these items.


Table 10. Amount of Time Spent Collecting Science Data, Questions 52-56 (frequencies are counts of responses 0-4)

Have class discussions about the data
    2005: 2.36 (1.34), frequencies 1/4/1/5/3     2008: 2.43 (1.22), frequencies 1/2/4/4/3
Organize, display the information in tables, graphs
    2005: 1.21 (1.05), frequencies 4/5/6/2/0     2008: 1.71 (1.33), frequencies 2/6/2/2/2
Make a prediction based on the data
    2005: 1.71 (0.99), frequencies 2/3/6/3/0     2008: 2.21 (1.12), frequencies 1/2/6/3/2
Analyze the information or data orally or in writing
    2005: 1.64 (1.08), frequencies 3/2/6/3/0     2008: 2.14 (1.10), frequencies 1/2/7/2/2
Make a presentation to the class on the data, analysis, or interpretation
    2005: 1.36 (1.34), frequencies 5/3/3/2/1     2008: 1.71 (1.20), frequencies 2/5/3/3/1

Figure 10 displays the 2005 and 2008 means for questions 52 through 56.

Figure 10. Means for Questions 52-56, 2005 and 2008

Questions 57 through 62 focused on the amount of time spent using calculators, computers, or other educational technology as part of science instruction, using the same zero-to-four scale.


The question for each item stem was:

When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of science instruction, how much of that time do they:

Table 11 includes the means, standard deviations (sd), and frequencies of responses for these items.

Table 11. Amount of Time Spent Using Educational Technology, Questions 57-62 (frequencies are counts of responses 0-4)

Learn facts
    2005: 1.36 (1.28), frequencies 5/2/5/1/1     2008: 0.93 (0.92), frequencies 5/6/2/1/0
Practice procedures
    2005: 1.07 (1.44), frequencies 8/1/2/2/1     2008: 0.93 (1.07), frequencies 6/5/1/2/0
Use sensors and probes (e.g., CBLs)
    2005: 0.07 (0.27), frequencies 13/1/0/0/0    2008: 0.07 (0.27), frequencies 13/1/0/0/0
Retrieve or exchange data or information (e.g., Internet)
    2005: 1.07 (1.14), frequencies 6/3/3/2/0     2008: 1.07 (0.92), frequencies 4/6/3/1/0
Display and analyze data
    2005: 0.57 (1.09), frequencies 10/2/0/2/0    2008: 1.07 (1.14), frequencies 6/3/3/2/0
Solve problems using simulations
    2005: 0.29 (0.61), frequencies 11/2/1/0/0    2008: 0.57 (0.85), frequencies 8/5/0/1/0

Figure 11 displays the 2005 and 2008 means for questions 57 through 62.

Figure 11. Means for Questions 57-62, 2005 and 2008


RESULTS – MATH

There were 22 math participants who completed the math SEC at both time points. Results are presented for each of the questions identified as important by the SMARTS project leaders. Within the content areas, the organization follows the SEC section headings, starting with professional development and then instructional practices.

There were three questions focused on the frequency of professional development (PD) activities. Each question had two parts, one focused on the frequency of PD and the second focused on the number of PD hours. Respondents were asked to “consider all the professional development activities related to mathematics content or mathematics education that you have participated in since June 1st of last year.” Professional development was defined in varied ways for each of the three questions.

Three of the 22 math respondents had no responses for these items in the 2005 administration and were not included in the analysis. Therefore, results for the 19 respondents with complete 2005 and 2008 data are reported in this section.

Question 102 focused on workshops and in-service training, which were defined as follows: “In-service training is professional development offered by your school or district to enhance your professional responsibilities and knowledge. Workshops are short-term learning opportunities that can be located in your school or elsewhere.” The item stem for question 102 was:


For the time period referenced above [June 1st of last year], how often, and for how many total hours, have you participated in workshops or in-service training related to mathematics or mathematics education?

Tables 12 and 13 present the frequencies of participation for 2005 and for 2008 (question 102, parts a and b).

Table 12. Frequency of Participation in Workshops or In-Service Training (102a)

Frequency of response    2005   2008
Never (0)                  6      4
Once (1)                   1      0
Twice (2)                  1      2
3 to 4 times (3)           4      6
5 to 10 times (4)          6      4
> 10 times                 1      3

Table 13. Hours of Participation in Workshops or In-Service Training (102b)

Frequency of response    2005   2008
N/A (0)                    6      5
1 to 6 hours (1)           6      1
7 to 15 hours (2)          3      3
16 to 35 hours (3)         3      7
36 to 60 hours (4)         1      2
61+ hours (5)              0      1

The frequencies for questions 102a and 102b are also represented in the bar graphs in Figures 12 and 13.

Figure 12. Frequencies for Question 102a


The mean for item 102a for 2005 was 2.32 (standard deviation = 1.83) and the mean for 2008 was 2.79 (standard deviation = 1.72). The mean for item 102b for 2005 was 1.32 (standard deviation = 1.25) and the mean for 2008 was 2.16 (standard deviation = 1.57).

Figure 13. Frequencies for Question 102b


Question 103 focused on summer institutes, which were defined as “longer term professional learning opportunities, for example, of a week or longer in duration.” The item stem for question 103 was:

For the time period referenced above [June 1st of last year], how often, and for how many total hours, have you participated in summer institutes related to mathematics or mathematics education?

Tables 14 and 15 present the frequencies of participation for 2005 and for 2008 (question 103, parts a and b).

Table 14. Frequency of Participation in Summer Institute Training (103a)

Frequency of response    2005   2008
Never (0)                 12      7
Once (1)                   4      6
Twice (2)                  1      1
3 to 4 times (3)           0      3
5 to 10 times (4)          2      2
> 10 times                 0      0

Table 15. Hours of Participation in Summer Institute Training (103b)

Frequency of response    2005   2008
N/A (0)                   13      8
1 to 6 hours (1)           1      2
7 to 15 hours (2)          0      2
16 to 35 hours (3)         2      3
36 to 60 hours (4)         3      4
61+ hours (5)              0      0

The frequencies for questions 103a and 103b are also represented in the bar graphs in Figures 14 and 15.

Figure 14. Frequencies for Question 103a


The mean for item 103a for 2005 was 0.74 (standard deviation = 1.28) and the mean for 2008 was 1.32 (standard deviation = 1.42). The mean for item 103b for 2005 was 1.00 (standard deviation = 1.63) and the mean for 2008 was 1.63 (standard deviation = 1.67).

Figure 15. Frequencies for Question 103b

Question 104 focused on college courses related to mathematics or mathematics education. The item stem for question 104 was:

For the time period referenced above, how often have you attended college courses related to mathematics or mathematics education and about how many hours did you spend in class?

Tables 16 and 17 present the frequencies of participation for 2005 and for 2008 (question 104, parts a and b).

Table 16. Frequency of Participation in Math-Related College Courses (104a)

Frequency of response    2005   2008
Never (0)                 19     13
Once (1)                   0      1
Twice (2)                  0      2
3 to 4 times (3)           0      2
5 to 10 times (4)          0      1
> 10 times                 0      0

Table 17. Hours of Participation in Math-Related College Courses (104b)

Frequency of response    2005   2008
N/A (0)                   19     13
1 to 6 hours (1)           0      0
7 to 15 hours (2)          0      2
16 to 35 hours (3)         0      3
36 to 60 hours (4)         0      1
61+ hours (5)              0      0

The frequencies for questions 104a and 104b are also represented in the bar graphs in Figures 16 and 17.

Figure 16. Frequencies for Question 104a


For 2005, the means and standard deviations for both items 104a and 104b were zero (all 19 respondents answered "Never"/"N/A"). For 2008, the mean for item 104a was 0.79 (standard deviation = 1.32) and the mean for item 104b was 0.89 (standard deviation = 1.41).
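As a check on these values, the sketch below recomputes the 2008 means and standard deviations from the frequency counts in Tables 16 and 17 (response codes weighted by counts, n = 19).

    # Check the 2008 means and sds for items 104a and 104b against
    # the frequency counts in Tables 16 and 17 (codes 0..5).
    import numpy as np

    q104a_2008 = np.repeat(np.arange(6), [13, 1, 2, 2, 1, 0])
    q104b_2008 = np.repeat(np.arange(6), [13, 0, 2, 3, 1, 0])
    print(round(q104a_2008.mean(), 2), round(q104a_2008.std(ddof=1), 2))  # 0.79 1.32
    print(round(q104b_2008.mean(), 2), round(q104b_2008.std(ddof=1), 2))  # 0.89 1.41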

Figure 17. Frequencies for Question 104b

The next questions of interest were also centered on professional development, including items 129 to 139. The lead statement for each of these items was:

Since June 1st of last year, how much emphasis did your professional development activities in math or math education place on the following topics?

Six participants were deleted due to missing data from the 2005 administration of these 11 items, leaving a total of 16 respondents in the analysis of items 129 to 139. The scale was the same zero-to-three emphasis scale used in the science section.

Table 18 presents the frequencies of responses, means, and standard deviations of these items for 2005 and for 2008.

Table 18. Descriptive Statistics, Questions Focused on Math-Related Emphasis in PD (frequencies are counts of responses 0-3)

State mathematics content standards
    2005: 1.06 (1.18), frequencies 7/4/2/3     2008: 1.31 (1.14), frequencies 5/4/4/3
Alignment of mathematics instruction to curriculum
    2005: 1.69 (1.08), frequencies 2/6/3/5     2008: 1.75 (1.06), frequencies 3/2/7/4
Instructional approaches (e.g., use of manipulatives)
    2005: 1.38 (1.02), frequencies 3/7/3/3     2008: 1.88 (0.96), frequencies 2/2/8/4
In-depth study of, or specific concepts within, mathematics
    2005: 1.00 (1.10), frequencies 7/4/3/2     2008: 1.50 (1.03), frequencies 3/5/5/3
Study of how children learn particular topics in mathematics
    2005: 1.31 (1.20), frequencies 5/5/2/4     2008: 1.56 (1.09), frequencies 3/5/4/4
Individual differences in student learning
    2005: 1.31 (1.08), frequencies 4/6/3/3     2008: 1.75 (1.18), frequencies 3/4/3/6
Meeting the learning needs of special populations of students
    2005: 0.63 (0.81), frequencies 9/4/3/0     2008: 1.38 (1.02), frequencies 4/4/6/2
Classroom mathematics assessment
    2005: 1.25 (1.00), frequencies 4/6/4/2     2008: 1.25 (0.77), frequencies 3/6/7/0
State or district mathematics assessments
    2005: 1.38 (0.89), frequencies 3/5/7/1     2008: 1.13 (0.96), frequencies 5/5/5/1
Interpretation of assessment data for use in math instruction
    2005: 1.31 (0.87), frequencies 3/6/6/1     2008: 1.19 (0.91), frequencies 4/6/5/1
Technology to support student learning in mathematics
    2005: 0.75 (0.77), frequencies 7/6/3/0     2008: 1.00 (0.73), frequencies 4/8/4/0

Figure 18 displays the 2005 and 2008 means for questions 129 through 139.

Figure 18. Means for Questions 129-139, 2005 and 2008


Instructional Activities in Math

The results in this section focus on the reported instructional activities in math, including the type and frequency of use of various pedagogies and student activities. Descriptive statistics are reported for each year of SEC administration.

Questions 25 to 36 focused on the types of activities students were engaged in during mathematics instruction. Three respondents were missing data from the entire 2005 administration and were not included in this part of the analysis, leaving a total of 19 respondents.

The directions indicated that activities are not mutually exclusive, so percentages may exceed 100%. The zero-to-four scale represented the same increasing percentages of instructional time described in the science results.

The question for each item stem was:

How much of the mathematics instructional time in the target class do students use to engage in the following tasks?

Table 19 includes the frequencies, means, and standard deviations for questions 25 to 36.

Table 19. Instructional Activities in Mathematics, Questions 25-36

(n = 19; frequency cells give counts of respondents at scale points 0/1/2/3/4)

Item stem | 2005 Mean (sd) | 2005 Freq 0/1/2/3/4 | 2008 Mean (sd) | 2008 Freq 0/1/2/3/4
Watch teacher demonstrate how to do a procedure or solve a problem | 2.42 (1.02) | 0/4/6/6/3 | 2.11 (0.81) | 0/4/10/4/1
Read about math in books, magazines, or articles | 0.89 (0.94) | 7/9/1/2/0 | 0.74 (0.65) | 7/10/2/0/0
Take notes from lectures or the textbook | 0.63 (0.76) | 9/9/0/1/0 | 1.00 (1.05) | 7/7/4/0/1
Complete computational exercises or procedures from textbook or worksheet | 2.63 (1.21) | 0/5/3/5/6 | 2.58 (1.07) | 0/4/4/7/4
Present or demonstrate solutions to a math problem to the whole class | 2.53 (1.02) | 1/2/4/10/2 | 2.89 (1.10) | 0/3/3/6/7
Use manipulatives, measurement instruments, and data collection devices | 2.74 (0.99) | 0/3/3/9/4 | 2.79 (0.92) | 0/1/7/6/5
Work individually on math exercises, problems, investigations, or tasks | 2.74 (0.93) | 0/2/5/8/4 | 2.21 (1.03) | 1/3/8/5/2
Work in pairs or small groups on math exercises, problems, investigations, tasks | 3.05 (0.85) | 0/1/3/9/6 | 2.89 (0.81) | 0/1/4/10/4
Do a math activity with the class outside the classroom | 0.79 (0.63) | 6/11/2/0/0 | 0.74 (0.65) | 7/10/2/0/0
Use computers, calculators, or other technology to learn mathematics | 1.63 (0.90) | 0/11/5/2/1 | 1.68 (0.89) | 0/10/6/2/1
Maintain and reflect on a mathematics portfolio of their own work | 1.21 (0.98) | 5/7/5/2/0 | 1.58 (1.30) | 4/7/3/3/2
Take a quiz or test | 1.74 (0.93) | 0/10/5/3/1 | 1.42 (1.02) | 2/11/3/2/1
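A quick way to scan Table 19 for the largest 2005-to-2008 shifts is to difference the yearly means. The sketch below is our illustration, using a handful of rows transcribed from the table with abbreviated stems.

```python
# Difference the 2005 and 2008 means from Table 19 and sort the items
# by the absolute size of the shift (values transcribed from the table).
means = {
    "Watch teacher demonstrate a procedure":    (2.42, 2.11),
    "Work individually on math exercises":      (2.74, 2.21),
    "Take notes from lectures or the textbook": (0.63, 1.00),
    "Maintain/reflect on a math portfolio":     (1.21, 1.58),
    "Take a quiz or test":                      (1.74, 1.42),
}
changes = {item: y08 - y05 for item, (y05, y08) in means.items()}
for item, delta in sorted(changes.items(), key=lambda kv: abs(kv[1]), reverse=True):
    print(f"{delta:+.2f}  {item}")
# Largest shift among these rows: work individually, -0.53
```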

Figure 19 displays the 2005 and 2008 means for questions 25 through 36.

Figure 19. Means for Questions 25-36, 2005 and 2008


Next, questions 37 to 44 focused on the amount of time individual students spend working on various activities during mathematics courses. Responses again used the five-point (0-4) scale representing increasing percentages of time.

The question for each item stem was:

When students in the target class work individually on mathematics exercises, problems, investigations, or tasks, how much of that time do they:

Table 20 includes the frequencies, means, and standard deviations for questions 37 to 44.

Table 20. Instructional Activities in Mathematics, Questions 37-44

(n = 19; frequency cells give counts of respondents at scale points 0/1/2/3/4)

Item stem | 2005 Mean (sd) | 2005 Freq 0/1/2/3/4 | 2008 Mean (sd) | 2008 Freq 0/1/2/3/4
Solve word problems from a textbook or worksheet | 2.00 (0.88) | 0/7/5/7/0 | 1.89 (0.81) | 0/7/7/5/0
Solve non-routine mathematical problems | 2.05 (1.08) | 1/5/7/4/2 | 2.05 (1.03) | 1/4/9/3/2
Explain their reasoning in solving a problem by using several sentences orally or in writing | 2.42 (1.30) | 1/5/3/5/5 | 2.79 (0.98) | 0/2/5/7/5
Apply math concepts to “real world” problems | 2.42 (0.90) | 0/4/4/10/1 | 2.58 (0.96) | 0/2/8/5/4
Make estimates, predictions, or hypotheses | 2.37 (0.76) | 0/3/6/10/0 | 2.37 (0.90) | 0/3/8/6/2
Analyze data to make inferences, draw conclusions | 1.89 (1.05) | 1/7/5/5/1 | 2.21 (0.92) | 0/5/6/7/1
Work on a problem that takes at least 45 minutes to solve | 0.74 (0.87) | 9/7/2/1/0 | 0.68 (0.89) | 10/6/2/1/0
Complete or conduct proofs or demonstrations of mathematical reasoning | 1.16 (1.07) | 7/4/6/2/0 | 1.58 (1.07) | 4/4/7/4/0
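The report itself stops at descriptive statistics, but a reader who wants a rough effect-size scale for the Table 20 shifts can compute a pooled standardized mean difference (Cohen's d) from the summary values. The sketch below is an illustrative addition of ours, not part of the project's analysis, and it treats the two administrations as independent groups of 19 because respondent-level pairing is not published here.

```python
# Illustrative standardized mean difference (Cohen's d, pooled SD)
# from the summary statistics in Table 20; n = 19 in both years.
from math import sqrt

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m2 - m1) / pooled

# "Complete or conduct proofs or demonstrations of mathematical reasoning"
d = cohens_d(1.16, 1.07, 19, 1.58, 1.07, 19)
print(f"d = {d:.2f}")  # about 0.39, a small-to-medium upward shift
```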

Figure 20 displays the 2005 and 2008 means for questions 37 through 44.

Figure 20. Means for Questions 37-44, 2005 and 2008

Questions 45 to 52 focused on instructional activities that occurred in small groups or pairs. The scale used was the same five-point (0-4) percentage-of-time scale.

The question for each item stem was:

When students in the target class work in pairs or small groups on mathematics exercises, problems, investigations, or tasks, how much of that time do they:

Table 21 includes the frequencies, means, and standard deviations for questions 45 to 52.

Table 21. Instructional Activities in Mathematics, Questions 45-52

(n = 19; frequency cells give counts of respondents at scale points 0/1/2/3/4)

Item stem | 2005 Mean (sd) | 2005 Freq 0/1/2/3/4 | 2008 Mean (sd) | 2008 Freq 0/1/2/3/4
Solve word problems from a textbook or a worksheet | 1.79 (0.98) | 1/7/7/3/1 | 2.00 (1.05) | 0/8/5/4/2
Solve non-routine mathematical problems | 2.32 (1.29) | 2/4/2/8/3 | 2.16 (1.21) | 1/6/4/5/3
Talk about their reasoning in solving a problem | 2.53 (1.35) | 2/3/2/7/5 | 2.84 (0.96) | 0/1/7/5/6
Apply math concepts to “real world” problems | 2.32 (1.11) | 1/5/1/11/1 | 2.58 (1.07) | 0/3/7/4/5
Make estimates, predictions, or hypotheses | 2.26 (1.28) | 2/4/3/7/3 | 2.53 (1.02) | 0/3/7/5/4
Analyze data to make inferences, draw conclusions | 1.84 (1.21) | 3/5/4/6/1 | 2.21 (1.03) | 0/5/8/3/3
Work on a problem that takes at least 45 minutes to solve | 0.84 (1.12) | 9/7/1/1/1 | 0.68 (1.16) | 12/4/1/1/1
Complete or conduct proofs of their mathematical reasoning | 1.21 (1.40) | 8/5/2/2/2 | 1.53 (1.17) | 4/6/5/3/1
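Because Tables 20 and 21 share most of their item stems, the individual-work and pair/small-group means can be set side by side. The sketch below does this for the 2008 values of four shared stems; the pairing is ours and the stems are abbreviated.

```python
# Compare 2008 means for item stems shared between Table 20
# (individual work) and Table 21 (pair/small-group work).
shared = {
    "Solve word problems from a textbook/worksheet": (1.89, 2.00),
    "Solve non-routine mathematical problems":       (2.05, 2.16),
    "Apply math concepts to real-world problems":    (2.58, 2.58),
    "Work on a problem taking at least 45 minutes":  (0.68, 0.68),
}
for stem, (individual, group) in shared.items():
    print(f"{stem}: individual {individual:.2f}, group {group:.2f}")
```

On these stems the two settings look similar in 2008, with group work reported slightly more often for word problems and non-routine problems.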

Figure 21 displays the 2005 and 2008 means for questions 45 through 52.

Figure 21. Means for Questions 45-52, 2005 and 2008

Questions 53 to 57 focused on the use of hands-on instructional materials in mathematics classrooms. The scale used was the same five-point (0-4) percentage-of-time scale.


The question used for each item stem was:

When students in the target class use hands-on material, how much of that time do they:

Table 22 includes the frequencies, means, and standard deviations for questions 53 to 57.

Table 22. Instructional Activities in Mathematics, Questions 53-57

(n = 19; frequency cells give counts of respondents at scale points 0/1/2/3/4)

Item stem | 2005 Mean (sd) | 2005 Freq 0/1/2/3/4 | 2008 Mean (sd) | 2008 Freq 0/1/2/3/4
Work with manipulatives to understand math concepts | 2.74 (0.99) | 0/2/6/6/5 | 2.84 (0.96) | 0/1/7/5/6
Measure objects using tools such as rulers, scales, or protractors | 2.26 (0.81) | 0/3/9/6/1 | 2.47 (0.96) | 0/3/7/6/3
Build models or charts | 2.11 (1.24) | 2/4/6/4/3 | 2.32 (0.95) | 0/4/7/6/2
Collect data by counting, observing, or conducting surveys | 1.95 (1.18) | 1/7/6/2/3 | 2.16 (1.01) | 0/6/6/5/2
Present information to others using manipulatives | 2.42 (1.07) | 1/2/7/6/3 | 2.42 (1.02) | 1/2/6/8/2

Questions 58 through 63 focused on the use of educational technology during mathematics instruction. The scale was the same five-point (0-4) percentage-of-time scale.

The question used for each item stem was:


When students in the target class are engaged in activities that involve the use of calculators, computers, or other educational technology as part of mathematics instruction, how much of that time do they:

Table 23 includes the frequencies, means, and standard deviations for questions 58 to 63.

Table 23. Instructional Activities in Mathematics, Questions 58-63

(n = 19; frequency cells give counts of respondents at scale points 0/1/2/3/4)

Item stem | 2005 Mean (sd) | 2005 Freq 0/1/2/3/4 | 2008 Mean (sd) | 2008 Freq 0/1/2/3/4
Learn facts | 1.42 (1.02) | 3/8/6/1/1 | 1.16 (0.96) | 5/8/4/2/0
Practice procedures | 2.00 (1.29) | 3/3/7/3/3 | 1.89 (1.15) | 2/6/4/6/1
Use sensors and probes | 0.21 (0.42) | 15/4/0/0/0 | 0.32 (0.67) | 15/2/2/0/0
Retrieve or exchange data or information | 0.42 (0.61) | 12/6/1/0/0 | 0.74 (0.93) | 10/5/3/1/0
Display and analyze data | 0.79 (1.03) | 10/5/2/2/0 | 1.42 (1.12) | 4/8/2/5/0
Develop geometric concepts (e.g., using simulations) | 0.74 (0.99) | 9/8/1/0/1 | 0.74 (0.93) | 10/5/3/1/0
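As a crude summary of Table 23, the six item means can be averaged within each year. The sketch below is our illustration; note that it averages item means rather than computing a respondent-level scale score.

```python
# Average the six Table 23 item means within each year as a rough
# technology-use composite.
tech_2005 = [1.42, 2.00, 0.21, 0.42, 0.79, 0.74]
tech_2008 = [1.16, 1.89, 0.32, 0.74, 1.42, 0.74]

for year, item_means in (("2005", tech_2005), ("2008", tech_2008)):
    print(year, round(sum(item_means) / len(item_means), 2))
# Roughly 0.93 in 2005 versus 1.05 in 2008, with the gain driven
# mostly by "display and analyze data" (0.79 -> 1.42).
```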
