Behavior Change Training: How Research Shows the Impact of Learning
Tell us more about the importance of evidence in behavior change training. What makes data “evidence”?
When it comes to behavior change training, you want to be able to say that changes in learner behavior were a result of the training itself. There are many definitions of "evidence-based." The way we view it is that the data, the associated statistical analysis, and the design of the experiment together provide evidence that the differences you observe after the simulation are truly caused by the simulation. Ideally, this requires a randomized controlled trial in which you compare an experimental (or treatment) group to a control group.
In Kognito’s research, the experimental group participants complete the Kognito simulation and the control group does not. Then via our surveys, we compare the differences between those two groups. All things being equal between the two groups, the differences can be attributed to the treatment, or the simulation in this case. So that’s evidence that the simulation is having an impact.
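The between-group comparison described above can be sketched in a few lines. This is only an illustrative sketch with made-up Likert-scale survey scores, not Kognito's actual analysis pipeline; the function name and the data are hypothetical.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(treatment, control):
    """Welch's t-statistic: compares the means of two independent groups
    without assuming equal variances."""
    m1, m2 = mean(treatment), mean(control)
    v1, v2 = variance(treatment), variance(control)   # sample variances
    n1, n2 = len(treatment), len(control)
    se = sqrt(v1 / n1 + v2 / n2)   # standard error of the mean difference
    return (m1 - m2) / se

# Hypothetical post-survey preparedness scores (1-7 Likert scale)
treatment = [6, 5, 7, 6, 6, 5, 7, 6]   # completed the simulation
control   = [4, 5, 4, 3, 5, 4, 4, 5]   # did not

t = welch_t(treatment, control)
```

A large t-statistic here would indicate that the gap between the two groups' mean scores is unlikely to be chance, which, with random assignment, is the evidence that the treatment made the difference.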
This is why we can say that the simulations change learner attitudes: learners feel better prepared and more self-confident to manage a challenging conversation, which in turn leads to changes in behavior.
Why is it important to our clients that they’re using a tool that’s evidence-based?
You want to know that the tool works! There are many examples of things we wouldn’t do without evidence. For example, we wouldn’t take a prescription drug that’s not evidence-based; you would want it to have been proven effective in clinical trials. When it comes to a behavior change training tool, you want to know that it will have a real impact on learners and drive change in your organization. So for our clients, it makes sense to invest resources in a training that has been proven to work.
Kognito simulations incorporate evidence-based techniques, and our research also shows that the simulations themselves are evidence-based. Are these the same thing?
There’s a difference between what is going into and what is coming out of the product.
In our simulations, we provide people with an opportunity to practice and learn using evidence-based communication strategies, which are integrated into the conversation platform that drives the learning experience. An important evidence-based strategy is motivational interviewing, which has been scientifically proven to work – in this case, through over 100 studies. Other communication strategies featured in our simulations, like emotional regulation, mentalization, and demonstrating empathy, are also evidence-based.
In the last Q&A you told us more about Kognito’s data collection methodology. How does Kognito’s data become original research?
When we review our survey data, we first assess sample size, which determines whether a statistical analysis will have enough power to detect an effect. When we have a sufficient number of completed surveys from baseline to post-simulation to follow-up, we export the data and run the analysis.
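As a rough illustration of the sample-size question, a standard normal-approximation formula estimates how many completed surveys per group are needed to detect an effect of a given size in a two-group comparison. The defaults below (Cohen's d of 0.5, 5% significance level, 80% power) are conventional textbook values, not Kognito's actual thresholds, and the function name is my own.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided,
    two-sample comparison of means (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for significance
    z_beta = NormalDist().inv_cdf(power)            # critical value for power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

n = n_per_group(0.5)   # medium effect size (Cohen's d = 0.5)
```

Smaller expected effects require sharply larger samples, which is why the number of completed surveys is checked before any analysis is run.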
The data we collect constitutes original research, since it has not been published before. The analyses we run ultimately become part of a manuscript that we submit to a journal, where it undergoes peer review.
What are the types of studies you conduct to show whether Kognito is effective?
We conduct a number of different studies. Some of them are case studies where you are looking at the impact of the simulation on learners within a particular institution. Then we have field studies where we typically do a quasi-experimental, between-group design, which means that there isn’t true random assignment, but we still have two groups, the experimental and control. These are very common studies when evaluating the impact on thousands of participants from around the country where you do not have the luxury of a controlled environment.
All studies that we’ve published have been reviewed by the Institutional Review Boards of the institutions that run the studies. These boards ensure that participants receive a consent form so they fully understand the risks and benefits of the simulation and how their data will be handled – that is, anonymized, de-identified, stored on a secure server, and preserved for a set number of years.
Another characteristic of our studies is that they are longitudinal – we examine the impact over an extended period of time. This is why we implement follow-up surveys two or three months after someone completes a simulation: to see whether the learning from the behavior change training is sustained. Changes in behavior can only be measured with a longitudinal study, by comparing baseline data to data collected several months later. If there are any differences between the experimental group and the control group, all other things being equal, you can attribute them to the simulation.
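The baseline-versus-follow-up comparison across two groups amounts to a difference-in-differences: the change in the treatment group minus the change in the control group. Here is a minimal sketch with made-up counts of helping behaviors; the data and function name are hypothetical, not drawn from any Kognito study.

```python
from statistics import mean

def diff_in_diff(treat_base, treat_follow, ctrl_base, ctrl_follow):
    """Difference-in-differences: the treatment group's baseline-to-follow-up
    change minus the control group's change over the same period."""
    treat_change = mean(treat_follow) - mean(treat_base)
    ctrl_change = mean(ctrl_follow) - mean(ctrl_base)
    return treat_change - ctrl_change

# Hypothetical counts of helping conversations at baseline and 3-month follow-up
treat_base, treat_follow = [1, 2, 1, 0, 2], [3, 4, 2, 2, 4]   # simulation group
ctrl_base, ctrl_follow   = [1, 1, 2, 0, 2], [1, 2, 2, 1, 2]   # control group

effect = diff_in_diff(treat_base, treat_follow, ctrl_base, ctrl_follow)
```

Subtracting the control group's change strips out whatever would have happened anyway over those months, leaving the change attributable to the simulation.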
What sorts of resources are out there for organizations or clients to read more about evidence-based tools?
There are a couple of ways to go about this. You can do a Google Scholar search and look for journal publications in the areas of interest. In addition, the Suicide Prevention Resource Center (SPRC) has a listing of evidence-based programs that were in SAMHSA’s National Registry of Evidence-Based Programs and Practices (NREPP) before that program was discontinued. When looking at their resources and programs, there is a filter to display only programs with evidence of effectiveness, with more details on the evidence and a note about inclusion in NREPP if applicable.
What is a study you’ve worked on that stands out to you?
My favorite study is one that we just completed with the U.S. Navy’s 21st Century Sailor initiative. Over 1,000 Sailors took the Kognito Together Strong simulation which covers topics like adjusting to post-deployment life, help-seeking, and stigma reduction. We divided the Sailors from U.S. and European bases into a control group and an experimental group, and analyzed the data.
The results were amazing. We found that the simulation significantly increased Sailors’ helping attitudes: they reported being more prepared, likely, and self-confident to identify fellow Sailors in psychological distress, talk with them about their concerns, and, if necessary, make a referral to support services such as the Chaplain.
Results also showed that Sailors were more likely to self-refer if they themselves started to experience psychological distress. The simulation also reduced personal stigma around mental health, and it was beginning to reduce public stigma. If you can imagine an entire ship or naval base completing the training and thus reducing mental health stigma, you would see a cultural shift that truly supports mental health. That would be a profound effect. We presented the results at the American Association of Suicidology Conference this year, and a manuscript will shortly be submitted for review. I’m proud of that research and the team at Kognito.