Study population

The OEI CBO Evaluation project focused on programs in nine Ohio counties with large disparities in infant mortality and significant urban populations: Butler, Cuyahoga, Franklin, Hamilton, Lucas, Mahoning, Montgomery, Stark, and Summit. These counties include the urban areas of Hamilton, Cleveland, Columbus, Cincinnati, Toledo, Youngstown, Dayton, Canton, and Akron, respectively. Existing and newly developed community programs received funding to provide consistent data to the Ohio Department of Medicaid (ODM) and could use this funding for hiring, training, and general program needs. The population involved in developing the data collection materials consisted of supervisors and program leaders from OEI CBOs, the research team at Ohio State, and the funders at ODM. The population assessed encompassed every participant who enrolled in one of the funded programs in the nine OEI counties. The OEI data collection system collects data only for those enrolled in one of the CBOs, but the enrolled population consists largely of high-risk, predominantly minority women with the greatest need for interventions.

Prototyping the OEI data collection infrastructure: needs assessment and design

The research team at Ohio State for OEI consisted of faculty, skilled professionals, and technical experts tasked with developing, deploying, and evaluating the data collection system. First, in early 2018, our team conducted a needs assessment to identify program needs and conceptualize a multimodal data collection system; this involved reviewing previous program data collection materials, reviewing CBO grant applications, conducting individual and group interviews with CBO leaders, and reviewing data collection materials from similar programs identified during a literature review. The semi-structured interview guide asked open-ended questions about how organizations were managed, data collection and reporting methods, frequency of data collection, types of data collected, and the ability to add data points or change data collection methods. Based on our engagements with the CBOs, the research team sought to sketch out for the communities what a robust infrastructure might entail. The needs assessment occurred over a two-month period.

Twenty-one interviews were conducted in individual and group settings, and input was gathered from 57 CBO leaders representing 60 of the 64 OEI organizations. Some leaders represented multiple CBOs, and the remaining four CBOs did not participate because they did not respond to interview requests. The research team met and reviewed interview transcripts to categorize responses.

Our needs assessment revealed that the CBOs collecting the most data do so at intake; each time a participant is seen; and, for some programs, after birth and when exiting the program. Data collection occurs during home visits with a participant, at group classes, or in other one-on-one settings such as public offices. Some programs collect most of the intake data at the first visit, while others collect it over a few initial visits. Our OEI data collection system largely mirrors this workflow, with intake, birth, exit, and encounter forms offered in both online and paper formats.
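To make the form structure concrete, the sketch below shows one way the four form types and the two submission modes could be represented as simple data structures. The field names and example values are hypothetical illustrations, not the project's actual schema.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class FormType(Enum):
    """The four OEI form types described above."""
    INTAKE = "intake"
    BIRTH = "birth"
    EXIT = "exit"
    ENCOUNTER = "encounter"


class SubmissionMode(Enum):
    """How a CBO records a completed form (online portal or paper)."""
    ONLINE = "online"   # entered directly into the online portal
    PAPER = "paper"     # paper form, keyed in manually later


@dataclass
class FormSubmission:
    """One completed form for one participant (hypothetical fields)."""
    participant_id: str            # CBO-assigned participant identifier
    cbo_id: str                    # identifier of the submitting CBO
    form_type: FormType
    submission_mode: SubmissionMode
    collected_on: date
    responses: dict = field(default_factory=dict)  # variable name -> value


# Example: an encounter recorded during a home visit
visit = FormSubmission(
    participant_id="P-0001",
    cbo_id="CBO-42",
    form_type=FormType.ENCOUNTER,
    submission_mode=SubmissionMode.PAPER,
    collected_on=date(2018, 9, 15),
    responses={"visit_setting": "home", "services_offered": ["nutrition education"]},
)
```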

One of the major gaps noted during our needs assessment was that, without a system of this nature, there was excess variation in data collection approaches across CBOs. For example, initial engagement identified some community programs with advanced electronic data collection systems in place, while other CBOs worked entirely with paper records. We found that some CBOs collected only aggregated attendance data, while others maintained databases of extensive medical, behavioral, environmental, and demographic data. In addition, sufficient identifier information was often not collected, precluding post-processing and subsequent matching of the data with community infrastructure.

The needs assessment revealed potential benefits of implementing a more robust system: the sponsors would receive more consistent data collection and program evaluation, while CBOs collecting minimal data would receive data collection materials and support to help with evaluation. Most CBOs had performed little evaluation work; employees lacked the time and resources, and participant numbers did not allow for robust analysis of outcomes. Without such a system, CBOs risk allocating resources suboptimally, failing to target the individuals who may benefit most, and providing fewer referrals or interventions than a participant may need because risks go unrecognized.

The research team developed an initial variable list with the sponsors that was revised based on the needs assessment. The initial variable list included variables measuring demographic data, environmental and behavioral risk data, and data about care received by the mother and infant. The demographic data collected included names, birth dates, and Medicaid or Social Security identification numbers to ensure that Vital Statistics and Medicaid data are matched properly, as well as address information to map where participants reside. The needs assessment revealed a desire from the CBOs not to have an overly burdensome number of data points: most already collected much of the data and would not be able to add significantly more. Many clinical and birth-related variables that can be found in other materials were removed from the variable list to reduce the data collection burden on participants. Many of the variables not collected are linked into the final dataset from Vital Statistics birth records, including birth weight, gestation at birth, and other health variables. The focus of the OEI data collection materials was to learn more about program participation, including its intensity and the services offered, as well as the environmental and demographic factors that may play a role in infant and maternal health. Much of the data collected by the new materials is not available in any of the other linked data sources. Data collection materials were developed once the variable list was finalized. These variables would subsequently be collected and curated on a database server that concomitantly allowed for the querying of data through a common data model (CDM). All data collection activities were approved by an Institutional Review Board at the Ohio Department of Medicaid.
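The sketch below illustrates, with invented column names, how the identifier variables described above could support a deterministic link between OEI records and Vital Statistics birth records. It is a minimal example under those assumptions, not the project's actual linkage code, which would also need fallback keys and careful handling of non-matches.

```python
import pandas as pd

# Hypothetical OEI intake extract: identifiers collected to enable linkage
oei = pd.DataFrame({
    "first_name": ["Ada"],
    "last_name": ["Smith"],
    "dob": ["1992-04-03"],
    "medicaid_id": ["MCD123456"],
    "zip": ["43215"],
})

# Hypothetical Vital Statistics birth-record extract: clinical variables
# intentionally left off the OEI forms (e.g., birth weight, gestation)
vitals = pd.DataFrame({
    "mother_medicaid_id": ["MCD123456"],
    "birth_weight_g": [3120],
    "gestation_weeks": [38],
})

# Deterministic merge on the Medicaid identifier; a production linkage
# would add fallback keys (name plus date of birth) and review of non-matches
linked = oei.merge(
    vitals, left_on="medicaid_id", right_on="mother_medicaid_id", how="left"
)
print(linked[["medicaid_id", "birth_weight_g", "gestation_weeks"]])
```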

Figure 1 illustrates our vision for the OEI data collection infrastructure and shows how the various data collection, curation, and reporting activities are integrated within the system. In the first step, a CBO provider collects information about a participant using one of the previously described collection forms (e.g., intake or birth). In the second step, the provider uses a data entry mechanism to report the data to the research team. Transfer of data is possible through an online data entry system, scanning and faxing forms, mailing forms, secure email of spreadsheet data or forms, or uploading to a secure online portal; in our case, we have used the Qualtrics platform, which accepts data either entered directly into a survey or uploaded securely as a file. The third step involves the curation of a database by the research team on a server that, in step four, can be used by the team to access data via a portal. Step five illustrates how the research team can use the curated database to develop a CDM that researchers and government agencies can use to build queries for reports and dashboards. The CBOs can also use this information to interact with the researchers and government agencies for decision-making purposes. Challenges with character recognition applications led to the use of manual data entry for paper and scanned forms.

Fig. 1 Vision for the Ohio Equity Initiative data collection infrastructure
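As one way to picture steps two through five of the flow in Fig. 1, the sketch below reads a hypothetical export from the online entry system, appends it to a curated table, and runs a simple report-style query. The file name, column names, and the local SQLite target are illustrative assumptions and stand in for the project's actual server-side pipeline.

```python
import sqlite3

import pandas as pd

# Step 2 (illustrative): a CSV export downloaded from the online entry system
submissions = pd.read_csv("qualtrics_export.csv")  # hypothetical file name

# Light standardization before curation: consistent column names and dates
submissions.columns = [c.strip().lower().replace(" ", "_") for c in submissions.columns]
submissions["collected_on"] = pd.to_datetime(submissions["collected_on"])

# Steps 3-4 (illustrative): append to a curated table on a database server,
# represented here by a local SQLite file so the example stays self-contained
with sqlite3.connect("oei_curated.db") as conn:
    submissions.to_sql("form_submissions", conn, if_exists="append", index=False)

# Step 5 (illustrative): the kind of query a researcher might run against the
# curated data to feed a report or dashboard
with sqlite3.connect("oei_curated.db") as conn:
    counts = pd.read_sql(
        "SELECT form_type, COUNT(*) AS n FROM form_submissions GROUP BY form_type",
        conn,
    )
print(counts)
```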

Development of the OEI data infrastructure and common data model

Between February 2018 and July 2018, our team engaged in the development and deployment of the OEI data collection infrastructure. Critical early milestones during this period were developing an agreed-upon timeline for data reporting, piloting data collection forms with select CBOs, and refining those forms. The Excel spreadsheet and paper forms were sent to select CBOs that had volunteered during their needs assessment interviews to review and pilot test the forms. These CBOs were encouraged to provide feedback on the specific variables measured, ease of use, language and readability, and the general look of the forms. After editing the forms based on this feedback, final versions were developed and the online data portal was created using the same questions. Subsequent milestones focused on piloting the data collection system across all CBOs between July 2018 and August 2018, testing system integration, validating data points, and making system refinements based on the feedback obtained from the activities listed above. Data collection began in September 2018 for 30 programs, and additional programs were enrolled over time once training and data use agreements were finalized. The data use agreements allowed the organizations to share individual-level data with the research team, which aggregates and evaluates the data before presenting them to the sponsors or other programs.

During the development period of the data infrastructure, our team concurrently focused on the design considerations and development of the OEI common data model. The data model architecture for our project contains data about the OEI program, its participants, and non-OEI participants. The model consists of elements from our data collection infrastructure (i.e., the OEI system) linked to datasets from state databases (i.e., Ohio Vital Statistics and Medicaid claims). The online data collection system consists of a Qualtrics-based database, which offers the ability to collect data about multiple participants and from multiple CBOs. CBO members create login information and are directed to a separate landing page for each CBO. On this landing page, participant demographic information can be added and surveys can be completed and updated for each participant. These Qualtrics surveys follow the same format as the other data collection methods: CBOs create a record for a participant and can then complete intake, birth, exit, and encounter forms for that participant. These reports are downloaded directly by the research team, then processed and appended to the master dataset. CBOs can also securely send an Excel-readable spreadsheet that is imported and processed, or scanned paper forms that require an additional data entry step into a spreadsheet that is later imported. Another layer of data collection through the OEI system is the submission by some CBOs of data formatted for other data collection systems, in our case Care Coordination Systems (CCS) and the Ohio Comprehensive Home Visiting Integrated Data System (OCHIDS). CCS and OCHIDS contain data similar to the OEI system and required some standardizing before being appended to the master dataset.
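A minimal sketch of the standardize-then-append step for the CCS and OCHIDS submissions is shown below. The column crosswalks and sample rows are invented for illustration; the real mapping between those systems and the OEI variable list is project-specific and not reproduced here.

```python
import pandas as pd

# Hypothetical crosswalks from each external system's column names to the
# OEI variable names; the real mappings would cover many more fields
CCS_MAP = {"client_id": "participant_id", "visit_dt": "collected_on"}
OCHIDS_MAP = {"ParticipantID": "participant_id", "EncounterDate": "collected_on"}


def standardize(df: pd.DataFrame, column_map: dict, source: str) -> pd.DataFrame:
    """Rename columns to the OEI schema and tag the originating system."""
    out = df.rename(columns=column_map)
    out["source_system"] = source
    return out


# Hypothetical extracts from each external system
ccs = pd.DataFrame({"client_id": ["P-0002"], "visit_dt": ["2018-10-01"]})
ochids = pd.DataFrame({"ParticipantID": ["P-0003"], "EncounterDate": ["2018-10-02"]})

# Data already collected through the OEI portal, in the OEI schema
oei_portal = pd.DataFrame({
    "participant_id": ["P-0001"],
    "collected_on": ["2018-09-15"],
    "source_system": ["oei_portal"],
})

# Append everything to one master dataset once the columns are aligned
master = pd.concat(
    [oei_portal,
     standardize(ccs, CCS_MAP, "ccs"),
     standardize(ochids, OCHIDS_MAP, "ochids")],
    ignore_index=True,
)
print(master)
```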

Training to use the OEI data collection infrastructure

CBOs were trained on how to use the data collection materials in July and August 2018. These training sessions consisted of webinars with demonstrations of how to use the Qualtrics data portal, the validated Excel spreadsheet, and the paper forms. After these demonstrations, each CBO chose how it would submit data and was then provided with paper forms, the spreadsheet file, or portal login information to begin data collection. After the first month of data collection, the research team conducted the first round of three Plan-Do-Study-Act (PDSA) cycles. The purpose of the PDSA cycles was to gather information about barriers to data collection, answer CBO questions, help CBOs improve their data quality, foster relationships with CBOs, and further improve the data collection infrastructure. The PDSA cycles consisted of emailing surveys to all CBOs and inviting them to short phone calls to discuss their concerns with data collection. Phone calls were optional but especially encouraged for organizations that reported many data collection challenges in the survey. As part of these conversations, CBOs were given the opportunity to change data submission preferences, provide input about desired portal changes, and receive additional training.
