TRANSCRIPT
Getting Better All the Time
Assuring the Quality of COSF Data
Andy Gomm: New Mexico Part C
Jane Atuk: Alaska Part C
Lisa Backer: Minnesota Part C & 619
ECO Implementation in NM
Training provided to 34 provider agencies at their sites
ECO manual developed and distributed
Technical assistance made available through FIT staff and University of NM – Early Childhood Network
Roll out region by region (5 regions)
ECO quality assurance in NM
ECO Quality Assurance form developed
ECO lead staff with the Family Infant Toddler (FIT) Program initially reviewed all ECO forms
Review expanded to 4 FIT staff
Total ECO forms reviewed to date: approximately 1,300
ECO quality assurance in NM (cont.)
Each provider agency received specific feedback regarding rating selection and supporting documentation.
Once it was determined that an agency was completing the ECO forms to a high standard, it could be "graduated."
Once an agency graduated, FIT staff request its ECO forms on an "as needed" basis.
Additional ECO quality assurance
Providers receive a summary of the ECO quality assurance conducted
Data entered in new online data system – provides additional opportunities to review accuracy
Database reports provide ability to review whether ECO scores have been entered
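As an illustration only, a completeness check like the one described above could be automated. The sketch below assumes a hypothetical child-level export with columns child_id, entry_rating, and exit_rating; the column names and the pandas approach are assumptions for illustration, not part of New Mexico's actual database reports.

```python
# Illustrative sketch only, not NM's actual database report.
# Assumes a hypothetical child-level export with columns:
#   child_id, entry_rating, exit_rating (NaN = score not yet entered).
import pandas as pd

def missing_eco_scores(children: pd.DataFrame) -> pd.DataFrame:
    """Return records whose entry or exit ECO score has not been entered."""
    missing = children["entry_rating"].isna() | children["exit_rating"].isna()
    return children.loc[missing, ["child_id", "entry_rating", "exit_rating"]]

# Example (hypothetical file name):
# missing_eco_scores(pd.read_csv("eco_export.csv"))
```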
ECO Quality Assurance Form
The NM ECO review form includes:
Are all areas of the ECO form completed?
Were a minimum of three sources of information (approved assessment tool, clinical observation, and parent input) used to generate the rating?
Does the supporting evidence really support the ECO rating?
Is the ECO rating consistent with the child's eligibility category?
Lessons Learned
After initial training, all sites needed an additional, almost identical, training once they began implementation.
TA needs to be available promptly.
Pre-printing the sources of information on the supporting evidence section ensured that documentation was present from all three required sources.
Lessons Learned (cont.)
Regarding feedback on the ECO form:
Feedback needs to be prompt.
Feedback needed to go directly to the Service Coordinators completing the form, not just to their EC Coordinator (manager).
Positive feedback works! If a particular SC at an agency was doing a great job with the ECO form, we recommended that that SC mentor others at the agency and used his or her ECO form as an example of what we want.
Next Steps
Develop online training, available 24/7
Promote QA to be done by provider managers
Review online ECO reports, e.g., review data reports for patterns in scores
Include the ECO process (including the ECO Manual) in the Service Coordination training
Minnesota
Basic Realities
Education Lead / Birth Mandate State
"Local control" is valued
Teams must use multiple sources of information, including at least one criterion-referenced or curriculum-based measure cross-walked by ECO
Parent input must be documented on the COSF
Basic Realities (cont.)
Single target group of stakeholders & professionals for training on child outcomes reporting across Part C and Part B
Rating at exit from Part C becomes the entrance rating for Part B
Minnesota Automated Reporting Student System (MARSS) created in the late 1980s
No "real-time" data: data collected by LEAs throughout the year and reported to MDE each fall and at the end of each year
Stakeholder Roles/Responsibilities
Key Areas
Knowledge of typical child development
Ongoing assessment
Knowledge and use of COSF & process
Annual reporting of data
Ensuring validity
Family outcomes
Training & TA
“Get Started”
55 face-to-face trainings during Year 1
Data Retreat for Early Childhood Program Administrators (ECSE, Head Start, Pre-K) to promote professional investment in data
One-time additional appropriation of funds for tool purchase and training
Training & TA
“Get Better”
7 regional trainings in Year 2
Program survey of LEAs; training provided on the most popular assessment tools (HELP, AEPS, BDI-2, Brigance, Creative Curriculum)
Web-Ex training under development for implementation during Fall 2008
Validation self-study
Data Quality & Awareness
Simple logic check
Mean, median, and standard deviation calculated on entry and exit data sets for each LEA, for each outcome
Progress data calculated and made available for each LEA on a password-protected site
Does district data tell the right story?
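To illustrate the summary statistics and the "simple logic check" described above, here is a minimal sketch in Python/pandas. The column names (lea, outcome, point, rating) and the 1–7 valid-range rule are assumptions for illustration, not Minnesota's actual procedure.

```python
# Illustrative sketch only; column names are assumed, not taken from MDE's system.
# df has one row per rating: lea, outcome (1-3), point ("entry"/"exit"), rating (1-7).
import pandas as pd

def summarize_ratings(df: pd.DataFrame) -> pd.DataFrame:
    """Mean, median, and standard deviation per LEA, per outcome, at entry and exit."""
    return (
        df.groupby(["lea", "outcome", "point"])["rating"]
          .agg(["count", "mean", "median", "std"])
          .round(2)
    )

def simple_logic_check(df: pd.DataFrame) -> pd.DataFrame:
    """One possible logic check: flag ratings outside the valid 1-7 range."""
    return df[~df["rating"].between(1, 7)]
```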
COSF Entry Data – District A (N = 44)

             Median   Mean   Standard Deviation
Outcome 1       6     5.16         1.88
Outcome 2       4     3.75         1.80
Outcome 3       5     4.48         1.60
Correlation: Outcome 1 x Outcome 2
(entries are counts of children; "-" indicates no children in that cell)

           1     2     3     4     5     6     7   Total
   1      38     7     5     1     -     -     2      53
   2      24    68    18     8     3     4     -     125
   3       8    25    40    10     8     1     1      93
   4       -     7    30    22     9     4     3      75
   5       2     5    15    33    42    15    10     122
   6       1     7    14    20    34    48     9     133
   7       1     2     5     3    20    21    40      92
Total     74   121   127    97   116    93    65     693
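A cross-tabulation like the one above can be generated directly from child-level ratings. The sketch below is illustrative only; it assumes a hypothetical DataFrame with columns outcome1 and outcome2 holding each child's ratings on the 1–7 scale.

```python
# Illustrative sketch; column names outcome1/outcome2 are assumed.
import pandas as pd

def outcome_crosstab(ratings: pd.DataFrame) -> pd.DataFrame:
    """Cross-tabulate Outcome 1 ratings against Outcome 2 ratings, with totals."""
    return pd.crosstab(ratings["outcome1"], ratings["outcome2"],
                       margins=True, margins_name="Total")

def outcome_correlation(ratings: pd.DataFrame) -> float:
    """Spearman rank correlation between the two outcomes' ratings."""
    return ratings["outcome1"].corr(ratings["outcome2"], method="spearman")
```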
Self Study
Self-study tool under development:
Procedural Requirements
Sources of Information
Assignment of Ratings
Statewide training on use of tool 10/2/08
Lessons Learned & Next Steps
Lessons:
1. Getting started was easy. Getting better takes more work.
Next Steps:
2. Vigilant monitoring of all data submissions
3. Evaluate local use of self-study tool
Alaska
Jane Atuk, Early Intervention Specialist
Early Intervention/Infant Learning Program
COSF implementation in Alaska
COSF pilot at 7 regional sites, Feb-Dec 2006
Training provided to all providers at a statewide workshop, Feb 2007
Statewide implementation of COSF began March 1, 2007
DVD training modules provided to each regional program, Nov 2007, and now accessible online for ongoing local training
Quality assurance in Alaska
Technical assistance provided through state staff by phone and at regional sites
COSF database reports reviewed at least quarterly with feedback to local providers
Provider survey conducted July 2008
Survey Notes
92 ILP providers received the survey link by email (Survey Monkey)
67 responded for a 73% overall response rate
The number of responses on items varies because…
Subsets of respondents received some questions based on answers to other questions (skip logic)
Respondents could choose to not answer some questions
COSF training & information
90% of respondents answered an item about how they received COSF training/information
Of these (n = 60)…
70% attended an in-person statewide event
42% used the COSF training notebook
37% consulted with trained ILP providers
30% consulted with state-level staff
18% used DVD training modules*
7% used the Internet to access information
*DVD training modules were only available after statewide training events occurred
78% felt they could do the COSF process with varying confidence, but without further training
Overall Proficiency with COSF
(n = 66)
I know how to do it, but I need some more practice and assistance: 28
I am confident I know how to do it, and I do it well: 24
I understand to a point, but I need more training: 12
I do not know how to do this yet: 2
Sources of Information
The most typical resources used to inform COSF rating decisions (n = 64):
ILP provider observations: 61
Parents/foster parents/legal guardians: 54
Assessment results/test scores: 54
Specialists (OT/PT, speech/language, etc.): 44
Other family members/relatives: 11
Childcare providers: 9
Note: Respondents were asked to "check any that apply"
Gathering Information
The most typical methods used to gather information for COSF ratings (n = 64):
Meeting with people in person: 63
Meeting with people over phone or teleconference: 18
Communicating back and forth with people by email: 8
Videotaping interviews, assessments, observations: 6
Note: Respondents were asked to "check all that apply"
Decision-Making Tools
Were crosswalks helpful?
very much: 4; yes: 8; somewhat: 9; no: 3; don't know if using: 21; not using: 21

Were instructions for completing the COSF helpful?
yes: 36; no: 8; don't know: 6; not using: 14

Was the decision tree helpful?
very much: 24; yes: 24; some: 6; no: 2; don't know: 3; not using: 3
Determining COSF Ratings
Most commonly…
33% consulted with another provider
24% consulted with families
21% determined ratings on their own
18% used a team process
It would seem that providers most often did not use an "ideal" team approach
Note: 3 (4%) respondents did not answer this question.
Determining COSF Ratings
However…
63% (42) had used a team approach at times
Of these 42 providers…
64% felt the team approach enhanced the decision-making process
62% felt it contributed information that would otherwise not be available
95% felt it was relatively easy to reach consensus
Level of Parental Involvement
Typical parental involvement in COSF process on teams (n = 42)…
69% - contributed information, but were not usually present during team meetings
26% - usually were present and participated
5% - usually were not involved at all
Anchor Assessment Tools
(n = 63)
Battelle Developmental Inventory (BDI): 25
Early Learning Accomplishments Profile (ELAP, 2002): 19
Sewell Early Education Developmental Profile (SEED): 17
Early Learning Intervention Dev. Profile ("the Michigan"): 16
Hawaii Early Learning Profile (HELP, 2004): 16
Assessment, Evaluation, & Programming System (AEPS): 3
Bayley-III Scales of Infant & Toddler Development, 3rd ed.: 3
Carolina Curriculum for Infants & Toddlers (CCITSN-3): 3
Note: Respondents were asked to "check any that apply"
Anchor Assessment Tools
45 providers indicated training specific to assessment tools from…
91% local EI/ILP agency
27% assessment authors/publishers
20% university course
16% professional conference
13% state or regional workshop
11% private consultant or contracted trainer
7% another organization
Anchor Assessment Tools
Recentness of training (n = 45)…
24% within the last year
31% within the last two years
18% within the last five years
27% more than five years ago
43 of 61 (64%) respondents indicated someone else in their program has training/education specific to the anchor tools used
Added Comments
20 providers (30%) added a comment to the survey:
5 were clarifications of answers given
6 expressed objections to using the COSF
3 expressed difficulty with the COSF process
2 indicated confusion with the COSF process
3 were suggestions
1 was about the survey itself
16% of respondents made what could be considered negative comments
Lessons Learned & Next Steps
Train often and early
Regular feedback is essential
Providers appreciate being asked to give feedback on the process
Survey results will help to focus future training and technical assistance
Continue to elicit feedback from providers