Webinar Transcript: NIJ Evaluation of the Byrne Criminal Justice Innovation Program
This page presents the transcript of a webinar on the NIJ Evaluation of the Byrne Criminal Justice Innovation Program. The research project is a collaboration not only between the Bureau of Justice Assistance and the National Institute of Justice but also the BCJI Grantees, their research community partners, and the BCJI Training and Technical Assistance team.
Speaking in this webinar:
- Linda Truitt, Ph.D., Senior Social Science Analyst, NIJ
- Alissa Huntoon, Senior Policy Advisor, BJA
- Mary Jo Giovacchini, National Criminal Justice Reference Service
MARY JO GIOVACCHINI: Good afternoon, everyone, and welcome to today's webinar, NIJ Evaluation for the Byrne Criminal Justice Innovation Program. At this time, I would like to introduce our presenters, Dr. Linda Truitt, Senior Social Science Analyst at the National Institute of Justice, and Alissa Huntoon, Senior Policy Advisor at the Bureau of Justice Assistance.
LINDA TRUITT: Good afternoon everyone, this is Linda Truitt at NIJ. Welcome to our first webinar on the NIJ Evaluation of the Byrne Criminal Justice Innovation Program. We hope to continue communications with the BCJI Grantees and their site teams over the life of this research project using emails, webinars, meetings, and other opportunities to provide project updates, disseminate research findings, and solicit feedback on research participation. The research project is a collaboration not only between the Bureau of Justice Assistance and the National Institute of Justice but also the BCJI Grantees, their research community partners, and the BCJI Training and Technical Assistance team. As you may recall from Alissa Huntoon’s recent email, NIJ awarded the Phase 1 evaluation project to Dr. Natalie Hipple at Indiana University. She was able to observe BJA's recent grantee workshop and is working with us currently to finalize research plans. The purpose of this first webinar is to provide an overview of the project, beginning with BJA's perspective and their research goals.
[see slide #3] I will then describe NIJ's response to BJA’s request for research assistance and the specific tasks planned for the first phase of this research project. It is important to clarify expectations, so I will also describe research activities that collect information compiled by BJA or that may involve BCJI Grantee participation. Last, we will review project milestones, including plans for written documents and other dissemination. We hope that you took advantage of our recent request for questions when registering for the webinar. In the final segment, we will discuss questions and comments that we have received from potential participants, either from the webinar registration form or during this webinar. At the end, you will find contact information, instructions on how to submit questions after this live webinar, and where to find the webinar archive.
ALISSA HUNTOON: Thank you, Linda. Hello, everybody, thank you so much for joining us for this webinar today. As Linda mentioned, this is the first opportunity we have to share information with you about this research, and we look forward to continued opportunities to keep you in the loop about what is going on regarding the evaluation and what that means to you as a BCJI grantee, either current or past. I want to begin with some context about the Byrne Criminal Justice Innovation Program under which you all have received funds, so this shouldn't be new information to you.
[see slide #4] It started in 2012, and since that time, we've awarded nearly $43 million to about 65 sites. Those funds have gone out under a variety of categories—planning dollars, implementation funds, and combined planning and implementation grants. As the program has grown, we've also adapted and changed it to be responsive to the needs of the field. The point of the BCJI program is to take a new approach to persistent crime issues in neighborhoods around the country and to connect those approaches with revitalization. Overall, the goal is to support local communities in developing comprehensive solutions that approach these persistent crime issues in new ways. That involves taking a look at and digging into what has been driving crime in specific places within that neighborhood or area. Key to the BCJI program is the role of residents in problem identification as well as problem solutions. As with many other BJA programs, BCJI sites are required to partner with researchers to really look at data and do some analysis on what's driving crime, and then to select data- and/or evidence-informed strategies that directly tie back to those crime drivers. And then as I mentioned, it's a comprehensive set of solutions. It’s never just intervention, or just enforcement, or just prevention; it's a range, and these are connected to revitalization efforts in the neighborhood for long-term impact.
Let’s talk about why BJA supports program evaluation, and why in particular for the BCJI program.
[see slide #5] BJA has supported and funded evaluation across several of its programs over the years. That's really to ensure that we're building and maintaining effective programs, and that the results of the evaluation research help us to understand, define, and demonstrate program impact. It also helps us to refine the program model and support program and policy development. Specific to BCJI, BJA’s questions focus on how the program model is working overall. Is there a way that we can understand and describe themes across the sites? Are there common sets of strategies that are being employed successfully in support of the BCJI model? Is there a common lexicon or language and features that we can use to describe what's working and what's working well? Are sites using models that appear to show success, and are there certain models that we want to raise up? Also, how is program impact manifesting itself across the sites, and how is success being described? For example, do changes to physical space, creation of new partnerships, and leveraging other local, state, and federal resources support program goals? And of course, have there been reductions in crime in the target area as compared to overall or historical levels in that neighborhood? Are residents involved in new and engaging ways? Those are just some examples of the kinds of things that we want to explore.
Also, in building this program, BJA provides funding to a TTA provider to help you—the sites—achieve local success and implement the program. Inherent in that, and in this research, is looking at what role the TTA plays in helping sites achieve local success and how that supports the model and program development in general. And lastly, there have been different categories of funding. What has been the impact of those various funding categories? As you can see, there are a lot of questions, and I'm going to turn it over to Linda now, who's going to outline the specifics of this particular research effort.
LINDA TRUITT: Thank you. You can see this is quite a comprehensive program with a lot of challenges, but also a lot of engaging interest areas. In collaboration with BJA, NIJ developed a research project that is responsive to BJA's immediate need for feedback on the long-standing BCJI program. Phase 1 also lays the groundwork for future potential research to include more robust impact and cost studies.
[see slide #6] The research project has several goals. First, to document the overall BCJI program goals, objectives, design, and history. We're observing the grant activities to assess site operations and model fidelity. We're assessing site capacity for information collection, data management, and analysis. We’re identifying types of concerns that have been targeted and approaches that have been developed. We’re identifying key indicators of outcomes not limited to crime reduction and community safety, given the range of projects and target populations; folks have been very creative in identifying quite an array of areas to explore. We're documenting and assessing the TTA activities as they correspond to BCJI site needs and how satisfied the grantees are with those services. Finally, we’re assessing evaluability to inform future plans, including impact and cost studies.
As you may recall from Alissa’s email that outlined some of the project activities, Phase 1 can be divided into four tasks that I can describe in fairly simple terms along with the associated research activities.
[see slide #7] Task 1 is to conduct a comprehensive review of all BCJI program sites from FY 2012 grants forward to document and describe characteristics of communities and partnerships, types of concerns and strategies developed, resources and challenges identified, and so on. The activities include reviewing information compiled by BJA from implementation plans, progress reports, and quarterly performance measures also known as PMT updates. As grantees, you're familiar with all these things; they've not gone to waste. In addition, NIJ's researchers will use an online version of the Violence Reduction Assessment Tool (also known as VRAT) to collect information from key site staff. Some of you may be familiar with this tool. It's a planning and support instrument designed specifically for sites implementing these kinds of approaches—that is, multiagency, partnership-based, crime-reduction programs. The components include governance and project management, partnerships, data and analysis, and feedback and awareness. Those map onto dimensions of effective implementation identified in prior research.
Moving on to Task 2, based on information available from Task 1, we plan to identify up to 15 sites in consultation with BJA for an in-depth examination to document and describe adherence to the core elements of the BCJI program, lessons learned from successes and failures, and other grantee experiences. Activities include site visits for field observation in the community, interviews with BCJI grantees and local stakeholders, and review of any grant work products such as research partner reports. Moving on to Task 3, for those 15 sites identified, we'll assess how well each may support more rigorous evaluations; again, I refer to impact and cost analyses as examples. Activities include drafting a program logic model that reflects current strategies and how those were established, and also assessing available information resources that may be used to develop future research. Important concerns would include comparison groups, program outcome measures, and variables necessary to control for external influences that may drive outcomes independent of the program design. So to review here for a moment: there will be no outcome or impact evaluation at the site level in Phase 1, and NIJ's researchers will not be requesting transfer of actual data to verify the information managed.
Moving on to Task 4, this is a little different than your typical evaluation and so we're outlining here the various tasks involved.
[see slide #8] Our challenge is to evaluate TTA services. An important component of the program is this service, so we wanted to include both process and outcome evaluation to assess how TTA supports the development and implementation of BCJI programs across sites. Areas of interest include assisting grantees in general planning and implementation, supporting research partnerships for data collection and analysis, promoting criminal justice and community engagement, and helping sites address important concerns like sustainability. Similar to Task 1, activities begin with a general review of information compiled by BJA, including progress reports and quarterly TTA reporting system documents, also known as TTARS or TTARP (TTA Reporting Portal), as well as the review of TTA request and response records and any needs assessments or other information collected during the course of TTA service delivery. In addition, NIJ's researchers will observe any meetings or other events involving grantees, such as the recently held workshop for grantees in Washington, D.C. The researchers will interview all of the staff who deliver TTA services to get their input, and conduct an online survey to solicit grantee input.
To summarize the project, we awarded a 24-month grant to Dr. Hipple and her team.
[see slide #9] The team members include Jessica Saunders, her Co-Principal Investigator at RAND, and research assistants. They will conduct the four research tasks outlined for Phase 1 and document their findings in two work products due at the end of the project. One is an executive summary that's a little bit shorter and generally accessible to a broad audience, and the other is a more detailed research report that documents, with appendices, how things were done in the course of conducting the research. Those will both be released for public archive. Should NIJ and BJA pursue future research options, it's important that the information collected under Phase 1 is made public for fair and open competition. Once the final draft work products are ready, NIJ's researchers will be able to provide the sites with preliminary findings. We see this as a courtesy to our partners on this research project collaboration, and also an opportunity to solicit any feedback that would help us validate findings, our interpretation of findings, and recommendations. Over the life of the project, Alissa and I will be looking for other opportunities to provide project updates and disseminate findings. We invite your suggestions to present at national and regional grantee meetings, and to help us find other ways to disseminate project information. That's about all the information I have to share, so I'm going to hand it over to Mary Jo to bring us into the next segment.
ALISSA HUNTOON: We’re still in the early stages, but we'll continue to update you as we move forward, and we are open to your ideas on how best to do that as well. And as both Linda and I mentioned before, we'll continue to look for future opportunities. We certainly will answer whatever questions you have at this time, but you can also email or call us with questions in the future. Does anybody have any questions thus far about the information provided?
MARY JO GIOVACCHINI: We're going to open up the floor to any questions that you might have. While we're waiting for questions to come through, here is some information that will be helpful to you.
[see slide #11] A transcript of this webinar will be posted to the NIJ website. If you registered for the event, you will receive an email notifying you that the information has been posted, with a direct link to the transcript as well as the PowerPoint presentation. Contact information for both Alissa and Linda is included on this slide.
LINDA TRUITT: One of the questions that we have so far regards the selection of the 15 sites, which is an excellent question. In the first task, we will get the general lay of the land: the different types of programs, the different target populations, and some of the strategies employed for those. What we'd like to do is called purposive sampling, or selecting with a purpose those sites that vary on key characteristics. We’d like to find a set of programs that target different things so we can get lots of variety in approaches and target populations. We’ll know from the first task where each of those sites stands in terms of how far along they are. If an implementation plan has just been submitted, but they haven't yet set up a program, it's a little harder to evaluate than another site that may be further along, although we may not exclude it from future research plans. Part of the exercise of Task 1 is to get as much information as we can about where the program is at, in terms of its stability, and whether the information sources are available. Then we launch Task 2, which is really the process evaluation that drills down into what those information sources are. It is kind of a back and forth between the needs identified by our audience and what we've seen evolve from 2012 forward in different types of programs, where we're finding lots of different issues being addressed. A given site may target multiple issues, and some may just focus on one thing; they decide that's the one priority for them. We want to lay out the different approaches people are taking and then drill in and decide: what do we think are some strong options? Ideally we would pick one or two programs that do a certain thing, let's say topic A, and then we will put forward those as part of the 15 sites.
We would then visit the sites and get more into what's going on with those sites for careful scrutiny. Then we verify, as best as we can, that the information is there, the program seems to have some legs, and this is something that we could pursue for a more robust evaluation in Phase 2. Not that we wouldn't consider other programs. Let's say two years from now, we decide we want to pursue an evaluation. We would of course need to refresh our understanding of what's going on with these programs. Things happen beyond our control: changes in staffing, administration, budget, funding... we all know how that can be. Potentially we would be open to refreshing that site list. This is the kind of planning that other researchers may attempt to do within their project timeline; we're taking a step back and doing it as Phase 1.
MARY JO GIOVACCHINI: Can you provide a brief overview of the timeline?
LINDA TRUITT: The timeline is 24 months total. You can see that we're into April already. We're negotiating all the final paperwork to access the information compiled by BJA so that the researchers are fully compliant with all the human subjects and privacy protection requirements. They then review those materials and are able to contact folks in the field to follow up with any questions. Then over the summer, they will begin contacting the sites directly. As soon as possible, once the first round of information is in and reviewed, they will follow up with the VRAT online survey. Then they segue into Task 2. At each of the steps, Alissa and I will update you to apprise you of what's happening, what you can expect, and who will be contacting you. It could take anywhere up to six months to do the process evaluations, where they’re visiting each of the sites, making sure people are available to talk to, and so on. Sites will be given clear expectations of when and what is needed, so they can prepare in advance, including for the evaluability assessment. Meanwhile, in parallel, are the TTA evaluation activities, which are themselves not always sequential. In addition to listening to this webinar, the TTA providers are meeting with NIJ, BJA, and NIJ’s researchers to talk about their expectations. We’ll talk about accessing the records that BJA compiles, how we're going to conduct interviews with them, and other information collection. Three months before the end of the research project, we will have drafts of the final work products ready to share. Working backwards, analyses and report writing would happen in that last summer and fall. The idea is that by September-ish, we have a really strong draft that we can disseminate for review.
ALISSA HUNTOON: Okay. We have a few more questions. The next one is, how will the work of the site's local evaluator be incorporated into this research plan?
LINDA TRUITT: That's another excellent question that we've had discussions about here, such as how we want to make clear that we are not in a position to request actual data. And by that I mean information identifiable to a person—who it is that you're treating, who's involved in your program, that sort of individual-level stuff. There's not going to be a transfer of data from the researchers or the site to NIJ or the research team. It really is just understanding what the implementation plan is, how the program logic model is put together, and what kinds of analyses you have done. In the course of doing that work, each of the site researchers will have interim work products, including analysis results in a report, a presentation, a memo, or however they package the information that they're presenting to the community or other stakeholder group—that's what we'd like to see. We also thought about how we might have a separate project discussion for the partner researchers to review all those things. We want to make sure that even in the course of sharing those things, they're not inadvertently releasing something that may be considered sensitive. We would definitely negotiate with each of the sites to make sure that anything they're releasing to us, they're comfortable with. If they have any issues, we can work that out. What we wouldn't do in this first phase is interfere if y’all are doing some sort of research projects, surveys, etc. We would absolutely make sure that we're not confusing the people who are in your survey pool with any information collection that we're doing. Take the VRAT survey: we would want to make sure that people understand how cross-sector team members are being hand-selected for that, and that it should not be confused with any other activities that your partner researcher might be doing. The research partner and the people we select for the survey all should be clear so everyone's on the same page.
Similarly, for any other information collection like the TTA provider survey, we want to make sure that everyone understands that is coming from NIJ’s researchers and human subjects protection requirements are being met. Please don't confuse this with some other exercise that may come from the TTA provider like their service feedback loop. We’re communicating directly with the TTA providers to make sure we don't generate that kind of confusion.
ALISSA HUNTOON: Okay. Another question, and this is a clarification just about what's currently being funded and potentially in the future and so the question is, the currently funded NIJ evaluation is just the Phase 1 evaluation, correct? Would the impact/cost evaluation be a separate RFP?
LINDA TRUITT: Yes, that is correct. We're using this as a planning phase to get as much current information as we can, sufficient to say, do we have a basis to go forward? Do we have the kind of local information to support more advanced research and activities—ideally a cost study which is so relevant to everyone in terms of sustainability and other things? We would then decide what would be the funding mechanism for that. Is it appropriate to do a contract? Is it appropriate to do a grant solicitation? Do we want to do a cooperative agreement? Those are future administrative decisions, and Phase 1 is about collecting the meaty information behind those. We will start with the information that we will have after Phase 1—are the programs diverse, settled, stable, and able to sustain an evaluation that requires that kind of intense work? It takes concerted effort to make sure everyone understands the time and information commitments, and that everyone's onboard with the idea of doing this. Most likely it would be a separate solicitation and announced around the end of this project. Typically, NIJ’s cycle is that you would see our solicitation come out in January of a given calendar year.
ALISSA HUNTOON: There is another question that sort of is related, but I definitely want to address it. It's a question around sustainability of these programs and what other sites are doing. That's obviously an important question that BJA partners with the TTA provider to help you think through at the very beginning. This research effort, of course, is going to be looking at models and how they're being sustained, and down the line we will put that information out. However, I would encourage individuals who have this question to reach out to LISC, your TA Lead. Have those conversations with them about what others are doing, as well as your peer network in BCJI, and take advantage of some of the tools there that are used to share among sites in terms of what other people are doing.
MARY JO GIOVACCHINI: At this time, I do not see any other questions. If you have submitted a question and it wasn't addressed, please submit it again using the Q and A and selecting all panelists or all presenters.
ALISSA HUNTOON: Thanks so much to Linda, and to everyone for joining. Hopefully this answered some of your questions about this research. As we mentioned before, we will continue to provide updates to you, and this is not your only opportunity to ask questions. You have contact information for both Linda and myself, so please email or call us with any questions that you have. You also have the link to where this webinar will be archived. If you want to share it with your site partners who were unable to join, please do so. With that, thank you so much, and everybody have a great afternoon.
Date Created: May 16, 2017