Last summer, I began working at the UGA Marine Education Center and Aquarium as a Public Service and Outreach (PSO) Graduate Assistant for the University of Georgia Marine Extension and Georgia Sea Grant. This twelve-month experience supported two semesters of my doctoral journey while affording me the privilege of learning how to evaluate informal science education programs in a real-world setting. Needless to say, I was incredibly excited about this opportunity, as I have been interested in getting my feet wet with program evaluation for several years.
Initially, the plan was to enroll in an evaluation course over the fall semester. That course was meant to provide the theories and foundation that would assist me in my new role as a PSO graduate assistant. My tentative plan was to learn evaluation theory in the classroom and then put those theoretical assumptions into practice. However, the evaluation course was shuffled to another semester, leaving me in the lurch, much like a baker trying to bake a cake without flour!
Despite this setback, I contacted evaluators from past projects and received a list of pertinent introductory texts. Some were dry, others profound, and still others created more questions than answers. When additional questions arose, supportive faculty within UGA Marine Extension and Georgia Sea Grant introduced me to evaluation experts in another PSO unit, the J.W. Fanning Institute for Leadership Development.
This open and collaborative environment among the PSO units encouraged conversations and meetings with expert evaluators at Fanning, who offered guidance and advice on the evaluation projects I was planning to implement. What a fantastic network of experts in a wide range of fields, and what an incredible group of faculty eager to support a graduate student’s project!
With texts and a supportive network in tow, I began working on logic models, program surveys and program evaluations. My projects included creating new evaluation tools and redesigning existing ones, and I also had opportunities to field test these instruments. Great learning moments happened this year through these practical, real-world experiences. I believe that program evaluation, especially in the informal science context, is incredibly important for visitors and staff alike.
I realized the importance of effective program evaluation while working at a wonderful science education facility in Tennessee that ended up closing its doors; I believe a lack of program evaluation data was partly to blame. This popular facility offered incredible experiential, science-based learning opportunities to students from surrounding counties. While working there, I took kids into the great outdoors, not just to teach them, but to help them experience science concepts firsthand.
We went canoeing on a river, spelunking in a wild cave and hiking along part of the Appalachian Trail. Kids were learning while doing, and the doing wasn’t easy. This program empowered its participants and built confidence among widely diverse groups of students and adults. This was the place where you heard kids and adults go from “I can’t” to “Let’s do that again!”
However, despite its popularity, this facility lost funding and eventually closed. Would program evaluation have prevented the loss of funding? I’ll never know, but I find it hard to believe that a program so loved could just be lost forever. Could program evaluation data have captured the years of life-altering anecdotal evidence and documented the learning and personal growth that occurred at that facility? YES! Most definitely, yes, and maybe that data could have been used to secure resources. I can only speculate.
Don’t get me wrong: there is a long way between UGA Marine Extension and Georgia Sea Grant and my Tennessee memory. My point is that even the greatest programs can fall on hard times, and data from evaluative tools might prove beneficial during those times. UGA Marine Extension and Georgia Sea Grant offer many exciting experiential and informal science-based learning opportunities, and continual evaluation of these programs is vitally important.
Through this year as a PSO graduate assistant, I have been rewarded on many levels, and I hope I have made a positive impact by assisting with program evaluation strategies. I am grateful for this amazing chance to practice life as an evaluator, and I genuinely would like to thank Jennifer Frum, Mark Risse and Anne Lindsay for this year’s opportunity!