To help build a more robust professional learning network for educators so they, in turn, could improve outcomes for students with disabilities, the U.S. Department of Education’s (U.S. DOE’s) Office of Special Education Programs awarded Arkansas a substantial State Personnel Development Grant (SPDG) in 2009. As part of the grant, the U.S. DOE recommended that the state engage an outside evaluator to measure the impact of its efforts. However, one year into the five‐year grant cycle, Arkansas’ third‐party evaluator retired, and the state needed to quickly identify a qualified replacement. Public Sector Consultants (PSC) prepared a detailed project proposal and, after a rigorous selection process, was chosen as Arkansas’ new external evaluator.
Instead of recommending immediate changes, we first spent a great deal of time with Arkansas’ SPDG team learning about its current evaluation methods and processes, carefully reviewing the grant requirements, and understanding the goals and objectives of its existing professional development programs. What we found was that while Arkansas had an extensive measurement‐driven evaluation (using about 47 different metrics), it largely focused on the number of professionals reached rather than on the impact of its programs on educators.
Given that the federal government was actively encouraging grantees to move away from this type of evaluation, PSC began working collaboratively with SPDG team members to redesign their evaluation methods, streamline the process, and shift the focus away from the number of individuals served and toward program outcomes and system sustainability. At every stage, we asked our project partners: What data can we collect, and in what ways, to help Arkansas better tell its story?
While many organizations have expertise in evaluation, or special education, or professional development, or working within the SPDG network, few have experience in all those areas. At PSC, however, we had demonstrated know‐how in every aspect of the project, which meant we could quickly step in to address the full range of Arkansas’ needs.
We were particularly well versed in the range of issues that special education teachers and their students face in the classroom, as well as in how to design and conduct surveys, focus groups, community forums, and interviews to elicit meaningful and necessary input.
Finally, we were willing to take a longer‐term, more collaborative approach to evaluation. Whereas some third‐party evaluators come in, gather and analyze data, and hand the results to their client at the end of the project, we followed best practices by working in concert with the Arkansas SPDG team throughout the grant cycle. Using a team approach, we fully engaged them in making decisions, designing and redesigning strategies for assessing their objectives and activities, and using the data gathered to make informed decisions and continuous program improvements.
One of the things I value most in working with PSC is their professional and participatory approach to the project. They model how to come together to examine data from various sources, assess the quality and usefulness of that data, and determine how best to apply it to improve our own practice. Working with PSC is some of the best professional development I’ve ever participated in.
SPDG Director/State Systemic Improvement Plan Coordinator
Special Education Unit, Arkansas Department of Education
Despite changes in project directors, state leadership, and federal views about how best to approach measurement, PSC ultimately helped Arkansas adapt quickly and develop a suite of products (including secure data systems, customized reporting, and infographics to assist with data visualization) to demonstrate the success of its efforts.
With our help and encouragement, the SPDG team also successfully shifted its professional development measurement from quantity to quality, and from effort (the number of trainings scheduled and people reached) to sustainability (increasing capacity within the system so it remains and expands after the funding ends). This approach has been adopted for the next five‐year SPDG grant, for which we will continue providing evaluation services.
Finally, SPDG team members improved their own knowledge and practice during the project. They were not afraid to ask hard questions or receive negative feedback from us. Instead, they welcomed our input at every stage so they could learn how to make their own adjustments along the way.
Why It Matters
At PSC, we’re dedicated to using both our content knowledge and our evaluation expertise to make things better. In this case, our combined efforts provided Arkansas the data needed to develop higher‐quality professional development programs that, ultimately, should improve student learning.
We particularly enjoy working with clients — like Arkansas — who appreciate evaluation and understand its potential for making change. Going into this project, we shared a similar philosophy — if providing quality instruction for students in safe environments with well‐trained educators is what education is all about, then we have a responsibility to determine whether the state’s programs are helping to achieve that goal. If not, we need to gather reliable data, analyze it thoughtfully, and use it to figure out the best ways to get there.