Abstract

Purpose / Objectives

The Direct Verification Pilot tested the feasibility, effectiveness, and costs of using Medicaid and State Children’s Health Insurance Program (SCHIP) data to verify applications for free and reduced-price (FRP) school meals instead of obtaining documentation from parents and guardians.

Methods

The Direct Verification Pilot took place in School Years (SY) 2006–07 and 2007–08. Six states implemented direct verification with Medicaid/SCHIP (DV-M) and participated in the pilot evaluation: Georgia, Indiana, Oregon, South Carolina, Tennessee, and Washington. Randomly sampled school districts in these states had access to DV-M and were surveyed by mail and online (2006–07, N = 85; 2007–08, N = 118). Respondents reported their experiences with and views about DV-M, and the time and costs spent on verification. Additional random samples of school districts provided FRP applications from families that did not respond to verification requests. Researchers matched these applications with Medicaid records to estimate the impact of DV-M on verification nonresponse.

Results

Over one-half of all school districts in the study used DV-M. The percentage of districts with positive attitudes toward DV-M varied across states. In SY 2007–08, 30% to 97% of districts using DV-M found it easy, and 61% to 100% would use it again. District attitudes were related to timeliness, design, and understanding of DV-M. The percentage of sampled applications verified with Medicaid/SCHIP ranged across states from 2% to 25%. This wide range was primarily due to differences in Medicaid/SCHIP income eligibility limits and whether both Medicaid and SCHIP data were used. DV-M verified up to 24% of applications from households that did not respond to verification requests. If DV-M verified at least 8% of sampled applications, districts saved time.

Applications to Child Nutrition Professionals 

Most states and school districts can improve verification, save time, and reduce burdens on families if they implement DV-M.

Full Article

Please note that this study was published before the implementation of the Healthy, Hunger-Free Kids Act of 2010, which went into effect during the 2012-13 school year, and its provision for Smart Snacks Nutrition Standards for Competitive Food in Schools, implemented during the 2014-15 school year. As such, certain findings may not be relevant today.

Each year, school districts certify approximately 30 million students to receive free or reduced-price (FRP) school meals through the National School Lunch Program (NSLP) and the School Breakfast Program (SBP). Some students’ families submit applications for FRP meals, while other students are directly certified by the school district. Direct certification uses lists of children approved for the Supplemental Nutrition Assistance Program (SNAP, formerly the Food Stamp Program) and the Temporary Assistance for Needy Families (TANF) program to approve students for free meals without an application.

Applications for FRP meals do not require supporting documentation of eligibility at the time of certification, and the receipt of FRP meals by ineligible students is a significant problem. In School Year (SY) 2005–2006, the NSLP and SBP made overpayments of $710 million due to certification errors (Food and Nutrition Service [FNS], 2007). To reduce these errors, school districts must verify a sample (usually 3%) of approved applications for FRP meals. School districts usually verify FRP meals applications by obtaining documentation from the applicant’s family. This process burdens the family and the school district.

Direct verification is an option for school districts to streamline the verification of FRP meals applications by using information from other means-tested programs without contacting applicants. School districts use direct verification at the beginning of the verification process. After completing direct verification, they send letters to families whose applications still need to be verified.

The Child Nutrition and WIC Reauthorization Act of 2004 (P.L. 108-265) permits direct verification using data from SNAP, TANF, Medicaid, and the State Children’s Health Insurance Program (SCHIP). SCHIP serves uninsured children who would otherwise be over income for Medicaid. States may implement SCHIP as an expansion of Medicaid or as a separate program. Therefore, Medicaid and SCHIP information may reside in a single information system or in separate systems.

School districts can use direct verification for all sampled applications, including those based on income and those with categorical eligibility (using SNAP/TANF case numbers). Information that students are enrolled in SNAP or TANF verifies that their families have income at or below the eligibility limit for free meals, and therefore verifies any school meals application. In all states, the maximum Medicaid/SCHIP income limit exceeds the limit for free meals, so some Medicaid/SCHIP children are eligible for free meals while others are not. The district must carefully apply Medicaid/SCHIP information to determine whether it verifies free or reduced-price eligibility.

Information obtained for direct verification cannot be used to change the school meal eligibility category established at the time of application. For example, SNAP enrollment can verify an application approved for reduced-price meals, but it cannot be used to change the approval to free meals.
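To make these rules concrete, the sketch below shows, in Python, how a district system might apply a match result to one sampled application. It is only an illustration of the rules described in the two preceding paragraphs: the field names are hypothetical, and the thresholds stand in for the federal income limits for free (at or below 130% of the poverty guidelines) and reduced-price (at or below 185%) meals.

  # Illustrative sketch only; field names are hypothetical placeholders.
  FREE_LIMIT_PCT_FPG = 130      # free meals: income at or below 130% of poverty guidelines
  REDUCED_LIMIT_PCT_FPG = 185   # reduced-price meals: income at or below 185%

  def direct_verification_result(application, match):
      """Return 'verified' or 'not verified' for one sampled application.

      The application carries the eligibility category approved at certification
      ('free' or 'reduced'); the match carries data returned by the state system.
      Direct verification may confirm the approved category but never change it.
      """
      if match is None:
          return "not verified"              # no match: the household must be contacted
      if match.get("snap_or_tanf"):
          return "verified"                  # SNAP/TANF implies income within the free limit
      income = match.get("income_pct_fpg")   # Medicaid/SCHIP income as a percent of poverty
      if income is None:
          return "not verified"              # record lacks usable income information
      limit = FREE_LIMIT_PCT_FPG if application["category"] == "free" else REDUCED_LIMIT_PCT_FPG
      return "verified" if income <= limit else "not verified"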

Direct verification has the potential to enhance program efficiency, reduce burden on families and school district staff, and reduce the number of students whose school meal benefits are terminated because of nonresponse to verification requests. In SY 2007–08, 32% of students sampled for verification lost benefits because of nonresponse (Ranalli, Harper, & Hirschman, 2009).

School districts can implement direct verification with SNAP and TANF (DV-S) using data provided for direct certification, but most SNAP and TANF children are directly certified and thus exempt from verification. Direct verification with Medicaid and SCHIP (DV-M) is more likely to be useful, because large numbers of Medicaid/SCHIP children are not enrolled in SNAP or TANF. DV-M may also open up the possibility for other data exchanges between school districts and Medicaid/SCHIP agencies, as discussed in the conclusion of this article.

The Direct Verification Pilot was the first test of the feasibility, effectiveness, and costs of using Medicaid/SCHIP data for direct verification. Six states – Georgia, Indiana, Oregon, South Carolina, Tennessee, and Washington – volunteered for the pilot project and gave their districts the option to implement DV-M.

Methodology

The Direct Verification Pilot took place in SY 2006–07 and SY 2007–08. The pilot evaluation was designed to answer these general research questions:

  • How do states and school districts implement DV-M?
  • What percentage of school districts use DV-M if offered the option?
  • What percentage of school meals applications sampled for verification is directly verified by Medicaid/SCHIP data?
  • Do school districts find DV-M easy and useful? Will they use it again?
  • Does DV-M result in fewer terminations of school meal benefits for families that do not respond to verification notices?
  • What are the potential cost savings from DV-M at the local level?

Of the six states included in the evaluation, four (Indiana, Oregon, Tennessee, and Washington) implemented DV-M in SY 2006–07 and five (Georgia, Indiana, South Carolina, Tennessee, and Washington) implemented it in SY 2007–08. The states independently developed methods for DV-M based on existing methods of direct certification. Interviews with state Child Nutrition (CN) and Medicaid agencies documented the implementation of DV-M.

The pilot states used three methods to give districts the information needed to identify students selected for verification who were enrolled in Medicaid and to verify those students’ FRP eligibility category: queries, file downloads, and state-level matching. FRP eligibility may be confirmed with Medicaid/SCHIP information indicating family income and size, family income as a percentage of the poverty guidelines, or the income range (free or reduced price). Georgia, Indiana, and Washington provided Internet-based query systems that districts used to verify the eligibility of individual students. Oregon, Tennessee, and Washington distributed Medicaid data files to districts via the web. South Carolina collected districts’ verification sample data, matched it with Medicaid data, and distributed the results on disk. Indiana districts also could upload verification sample data for matching. Each state built on processes developed for NSLP direct certification. Only Indiana and Washington included SCHIP children in their data.
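Whatever the delivery method, the file-based approaches reduce to linking the district’s verification sample to a Medicaid/SCHIP extract on student identifiers. The sketch below (Python) illustrates one way such a batch match might work; the file names, field names, and matching key are hypothetical and do not reflect any pilot state’s actual system.

  import csv

  # Hypothetical inputs: a state Medicaid/SCHIP extract and a district's
  # verification sample, both CSV files matched on name and date of birth.
  KEY_FIELDS = ("last_name", "first_name", "date_of_birth")

  def index_by_key(path):
      """Read a CSV file and index its rows by a normalized identifier key."""
      index = {}
      with open(path, newline="") as f:
          for row in csv.DictReader(f):
              key = tuple(row[k].strip().lower() for k in KEY_FIELDS)
              index[key] = row
      return index

  medicaid = index_by_key("medicaid_extract.csv")
  matched, unmatched = [], []
  with open("verification_sample.csv", newline="") as f:
      for student in csv.DictReader(f):
          key = tuple(student[k].strip().lower() for k in KEY_FIELDS)
          record = medicaid.get(key)
          (matched if record else unmatched).append((student, record))

  print(f"{len(matched)} of {len(matched) + len(unmatched)} sampled applications "
        f"matched a Medicaid/SCHIP record")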

Each year, the researchers selected a random sample of school districts in states with DV-M for participation in the pilot study. These districts had access to DV-M and were asked to complete the Local Educational Agency (LEA) survey. The survey was nearly identical in the two years of the study and could be completed by mail or online.

The principal outcome measure was the number of applications directly verified, as a percentage of all applications sampled for verification. Therefore, districts were selected with probability proportional to size (PPS) based on the number of applications sampled for verification. In SY 2006–07, the sample comprised 121 of 884 districts in the four implementing states. In SY 2007–08, the sample comprised 130 of 954 districts in the five implementing states. The survey response rates were 70% (n = 85) in SY 2006–07 and 91% (n = 118) in SY 2007–08. The response rate was higher in SY 2007–08 because the data collection period was extended and nonresponding districts were contacted by telephone to complete the survey.

The LEA survey collected information about district participation in DV-M, verification samples, direct verification results, and opinions regarding DV-M. The survey also collected the staff hours and costs of using DV-M, and the hours and costs of obtaining verification information from families. Tabulations of survey data at the state level used sampling weights according to the districts’ probabilities of selection, adjusted for nonresponse.
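As a sketch of one standard form of such weights (assuming stratified PPS selection with a within-stratum nonresponse adjustment; the pilot’s exact adjustment is not detailed here), a responding district i in stratum h would receive the weight

  w_i = (1 / p_i) × (n_h / r_h),

where p_i is the district’s probability of selection, n_h is the number of districts selected in the stratum, and r_h is the number of those districts that responded.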

Telephone conversations with subsamples of districts gathered additional details on DV-M procedures and problems, and views about DV-M. In SY 2006–07, survey respondents from 15 school districts in Indiana, Oregon, Tennessee, and Washington participated in hour-long telephone forums. In SY 2007–08, telephone interviews (about 15 minutes long) were conducted with a purposively selected subsample of 11 survey respondents in Georgia and South Carolina.

A representative sample of school districts in Georgia, Indiana, Oregon, and South Carolina (n = 94) provided school meals applications from families that did not respond to verification notices (nonresponders) in SY 2006–07 (n = 1,993). Researchers matched these applications with Medicaid/SCHIP data to estimate the potential reduction in benefit terminations for nonresponders. Tennessee and Washington were excluded because those states successfully implemented DV-M in SY 2006–07, so their nonresponders had already been screened against Medicaid data. Indiana and Oregon were included in the nonresponder data collection despite implementing DV-M that year, because data problems kept DV-M from being effective in those states.

Results and Discussion

Use of DV-M
In both pilot years, over one-half of the study districts chose to use DV-M (Table 1). The average participation across states was 51% of districts in SY 2007–08, representing 52% of applications in verification samples. These percentages were higher in SY 2006–07, when Tennessee had 100% participation. Larger districts were as likely to use DV-M as smaller districts.

 

 Table 1

 District Participation in Direct Verification with Medicaid (DV-M)

 SY 2006–07

 State            Pilot sample size   Districts that used DV-M (a)   Percent of districts (b)   Percent of apps (c)
 Georgia          N/A                 N/A                            N/A                        N/A
 Indiana          37                  18                             49%                        53%
 Oregon           34                  15                             44%                        42%
 South Carolina   N/A                 N/A                            N/A                        N/A
 Tennessee        17                  17                             100%                       100%
 Washington       33                  19                             58%                        52%
 All states       121                 69                             63%                        62%

 SY 2007–08

 State            Pilot sample size   Districts that used DV-M (a)   Percent of districts (b)   Percent of apps (c)
 Georgia          14                  7                              50%                        63%
 Indiana          40                  20                             50%                        51%
 Oregon           N/A                 N/A                            N/A                        N/A
 South Carolina   21                  9                              43%                        43%
 Tennessee        16                  10                             63%                        54%
 Washington       39                  19                             49%                        49%
 All states       130                 65                             51%                        52%

 Note. Standard errors of estimates are not shown because the sample was designed to yield estimates of the percentage of applications that are directly verified; it was not designed for estimates that are percentages of districts. N/A = Not applicable; DV-M was not implemented.

 (a) In SY 2006–07, 58 percent of nonrespondents were contacted by telephone and confirmed that they did not use DV-M. In SY 2007–08, 66 percent of districts that did not respond by mail or web but completed the survey by telephone did not use DV-M. Therefore, it was assumed that nonrespondents to the survey did not use DV-M.

 (b) “Percent of districts” for each state is unweighted. For “All states,” this is the unweighted average of state percentages.

 (c) “Percent of apps” is equal to the applications sampled for verification in districts using DV-M as a percentage of all applications sampled for verification by districts selected for the study, weighted by stratum and district sampling weights.

Open-ended survey responses indicated three main reasons why districts did not use DV-M in SY 2007–08: insufficient information, insufficient resources, and difficulty using DV-M. Most often, districts did not participate in DV-M because officials did not know that it was an option or did not understand that it could be used to verify any application. Some districts did not use DV-M because it was too difficult or not available when districts began the verification process. For example, school districts in South Carolina received direct verification match results six weeks after the scheduled start of verification. Smaller delays occurred in Indiana and Washington during SY 2006–07, but not in SY 2007–08. Some districts cited insufficient staff, low expectations of the payoff, or a combination of these factors.

Effectiveness of DV-M
There was wide variation among states in the effectiveness of DV-M (Table 2). The measure of effectiveness was defined as the percentage of applications in the verification samples that were directly verified with Medicaid/SCHIP. Among districts that used DV-M in SY 2007–08, the percentages verified were 1.8 in Georgia, 6.8 in Tennessee, 19.1 in Washington, 19.3 in South Carolina, and 25.2 in Indiana. DV-M was more effective for free than reduced-price applications in all states except Washington. SY 2006–07 results were similar to SY 2007–08 for Tennessee and Washington. Except in Georgia, DV-S was less effective than DV-M. Where DV-S was used, it verified between 2% and 7% of applications. (Effectiveness results for Indiana and Oregon in SY 2006–07 are not reported because data problems interfered with DV-M in these states.)

 Table 2

 Estimates of the Effectiveness of Direct Verification with Medicaid (DV-M), SY 2007–08

 LEAs that used DV-M

 State and application type   Sample size (a)   Percent DV-M   Standard error

 Georgia (b)
   All applications           2,137             1.8            (0.44)
   Free                       1,513             2.2            (0.63)
   Reduced-price              624               0.9            (0.30)

 Indiana (c)
   All applications           1,351             25.2           (2.05)
   Free                       877               29.1           (2.62)
   Reduced-price              474               17.4           (3.21)

 South Carolina (b)
   All applications           1,086             19.3           (1.43)
   Free                       758               22.6           (1.80)
   Reduced-price              328               11.9           (2.17)

 Tennessee (b)
   All applications           616               6.8            (1.36)
   Free                       396               8.6            (1.75)
   Reduced-price              220               3.7            (2.07)

 Washington (c)
   All applications           888               19.1           (1.98)
   Free                       542               14.1           (2.19)
   Reduced-price              346               27.7           (4.26)

 Note. Estimates are weighted by district and stratum weights. LEA = Local Educational Agency.

 (a) Sample size is the number of applications sampled for verification in districts selected for the study.

 (b) Georgia, South Carolina, and Tennessee used only Medicaid for DV-M.

 (c) Indiana and Washington used Medicaid and SCHIP for DV-M.

DV-M was more effective in states with higher Medicaid/SCHIP income eligibility limits, especially those that used both Medicaid and SCHIP data for DV-M. These factors increased the probability of directly verifying applications. Indiana, South Carolina, and Washington had Medicaid income limits of 150% of the Federal Poverty Guidelines (FPG) or higher. Indiana and Washington used SCHIP as well as Medicaid data, but South Carolina did not. These three states had rates of effectiveness for DV-M around 20%, but it is likely that DV-M in South Carolina would have a lower rate of effectiveness in the future. (Two factors made the rate of DV-M in South Carolina higher than would be expected in the future. Unlike the other states, South Carolina did not have an automated DV-S system, so districts verified applications with Medicaid that might otherwise be verified by SNAP. In addition, Medicaid data used for DV-M in South Carolina did not include income, so some children approved for free meals and directly verified may have been over-income for these benefits.) DV-M was least effective in Georgia and Tennessee, where SCHIP data were not used and the Medicaid income eligibility limit was 100% of the FPG.

Differences in the ease of DV-M may have led to the difference in effectiveness between Tennessee and Georgia. Most school districts in Tennessee considered DV-M “easy” or “very easy” on a 5-point Likert scale. In contrast, districts in Georgia frequently indicated they were confused about DV-M or found the system cumbersome.

DV-M was more effective overall in Indiana than in Washington (25.2% versus 19.1%), but Washington had more reduced-price children directly verified (27.7% versus 17.4%). These differences could be due to differences in the income distributions of Medicaid/SCHIP and FRP children, in the accuracy of DV-M, or other factors. There was no means of assessing the accuracy of DV-M.

Effect of DV-M on Non-Response to Verification
DV-M would reduce the termination of benefits for nonresponse to verification requests, but the predicted effect varied among the four states in this analysis. Overall, 15% of applications from nonresponding families in the four states were matched with Medicaid/SCHIP data (303 of 1,993), and 94% of matched applications would have been directly verified if DV-M had been used (i.e., the family had Medicaid/SCHIP income consistent with the approved FRP eligibility category). The lowest rates of predicted DV-M for nonrespondent applications were in Georgia (5.2%) and Oregon (8.7%), and the highest rates were in Indiana (23.5%) and South Carolina (23.6%). The difference in predicted DV-M rates appears mainly due to the differences in Medicaid/SCHIP eligibility limits.

The predicted DV-M rates for nonrespondent applications were based on a match with records that included “Medicaid/SCHIP-only” children and Medicaid/SCHIP children who were also enrolled in SNAP or TANF. Thus, some applications could be verified with DV-S. The predicted marginal impact of DV-M on verification nonresponse was the percentage of applications verified by Medicaid/SCHIP that could not be verified with DV-S: about 65% to 75% of the predicted rates reported above (4.0% in Georgia, 6.7% in Oregon, and 15.4% in Indiana). The Medicaid data for South Carolina did not identify SNAP/TANF children, so the marginal impact of DV-M on nonresponders could not be estimated for this state.
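In other words, the marginal rate is roughly the predicted DV-M rate multiplied by the share of matched children who were not also enrolled in SNAP or TANF. Using the Indiana figures above as an example:

  23.5% × (about 0.66) ≈ 15.4% of nonrespondent applications verifiable by DV-M but not by DV-S.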

Attitudes Toward DV-M
District attitudes toward DV-M varied across states (Table 3). Among districts that used DV-M in SY 2006–07, the percentage that found it “easy” or “very easy” ranged from 67% to 93%. The lowest rating, in Washington, rose the next year after the state implemented changes to the DV-M system. In SY 2007–08, 86% or more of the districts found DV-M “easy” or “very easy” in the three states represented in both years of the study (Indiana, Tennessee, and Washington). Only 30% of districts in South Carolina and 56% in Georgia found DV-M “easy” or “very easy,” largely due to problems with the timeliness of DV-M, the clarity of communications, or the inherent limitations of the DV-M system.

 Table 3

 Attitudes toward Direct Verification with Medicaid (DV-M) among Districts That Used DV-M

                                                SY 2006–07              SY 2007–08
                                                IN    OR    TN    WA    GA    IN    SC    TN    WA
 Rate DV-M as “easy” or “very easy” (a)         93%   77%   87%   67%   56%   93%   30%   86%   97%
 Rate DV-M as “useful” or “very useful” (b)     40%   60%   35%   44%   48%   59%   54%   86%   96%
 Plan to use DV-M next year                     71%   72%   70%   74%   91%   100%  61%   86%   100%
 Number of districts (c)                        18    15    17    19    7     20    9     10    19

 Note. Percentages are weighted by district and stratum weights. The sample was designed to yield estimates of the percentages of applications that are directly verified; it was not designed for estimates that are percentages of districts, so standard errors of estimates are not shown.

 (a) Respondents were asked to rate ease of use on a Likert scale of 1 to 5, where 1 was “very easy,” 2 was “easy,” 3 was “not easy or difficult,” 4 was “difficult,” and 5 was “very difficult.”

 (b) Respondents were asked to rate usefulness on a scale of 1 to 5, where 1 was “not useful at all,” 2 was “not sufficiently useful,” 3 was “indifferent,” 4 was “useful,” and 5 was “very useful.”

 (c) Sample includes school districts that used DV-M, regardless of whether they had any direct verifications.

In SY 2007–08, the percentage of districts rating DV-M as “useful” or “very useful” on the 5-point scale ranged from 48% in Georgia to 96% in Washington. For districts using DV-M in SY 2007–08, the proportion expecting to use it in the next year ranged from 61% to 100%. Districts were less likely to view DV-M as useful, or to plan to use it in the future, in states with significant implementation problems or low percentages of applications directly verified.

Time and Cost Savings
Direct verification can save time and costs for districts. On a per-application basis, DV-M requires little effort: the average district spent 5 minutes per application sampled for verification. (Districts reported the total time spent on DV-M, which we divided by the total number of applications sampled for verification.) In contrast, the average time to obtain verification from a household was 71 minutes. Therefore, a district can save time with direct verification if more than 5/71, or about 7%, of applications in the verification sample are directly verified. The overall effectiveness of DV-M exceeded this break-even point in Indiana, South Carolina, and Washington. In Tennessee, the break-even point was exceeded for applications approved for free meals, so districts might obtain cost savings if they used DV-M only for these applications.
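Stated as a formula: if v is the share of sampled applications that are directly verified, DV-M saves time whenever the household-verification time avoided exceeds the time spent on DV-M, that is, whenever

  71 × v > 5, or v > 5/71 ≈ 7%.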

Conclusions and Application

Direct verification provides a means to verify eligibility for FRP school meals using information collected and verified by SNAP, TANF, Medicaid, and SCHIP. It has many potential benefits: improving program efficiency, eliminating the burden of responding to verification requests (for some families), reducing the workload for school district staff, and reducing the number of students losing FRP meal benefits due to nonresponse to verification requests.

Six states participating in this study succeeded in implementing DV-M, and four implemented it for two consecutive years. State Medicaid Agencies were generally cooperative in providing data, sometimes modifying their systems to make DV-M possible. State CN Agencies built on their experience with direct certification and DV-S to distribute Medicaid/SCHIP data to school districts. Among school districts selected for the study, over one-half used Medicaid/SCHIP data for direct verification.

The percentage of districts with positive attitudes toward DV-M varied across states. In SY 2007–08, 30% to 97% of districts using DV-M found it easy, and 61% to 100% would use it again. Differences in district attitudes were related to the clarity of communications about DV-M, the design of the DV-M process, and whether DV-M was available when districts began verification.

The percentage of sampled applications verified by Medicaid/SCHIP ranged from 2% in Georgia to 25% in Indiana. The key reasons for this variation were: first, Medicaid/SCHIP income eligibility limits varied across states; second, some states used SCHIP data for DV-M while others did not; and third, the effectiveness of direct certification varied across states, affecting the pool of applications in verification samples. For the average school district, DV-M would result in time savings if at least 8% of sampled applications were directly verified. Regardless of whether the process reduces costs for the school district, direct verification of any application eliminates the burden of verification on the family and the risk that eligible children might lose benefits due to nonresponse.

DV-M is clearly feasible and appears to be cost-effective for some but not all school districts. It offers a “win-win” opportunity, reducing the burden of verification for both school districts and families. Furthermore, we estimate that its use in the study states could reduce benefit terminations for nonresponse to verification requests by between 5% and 24%.

Across the nation, all but three states have Medicaid/SCHIP income eligibility limits at or above 185% of the FPG. Thus, DV-M can include the full income range for FRP meals eligibility in 47 states and the District of Columbia. In 32 states, however, covering this income range requires that DV-M use data from SCHIP as well as Medicaid. In some pilot states, SCHIP and Medicaid data reside in separate systems, and integrating data from the two programs would be challenging. This is potentially an issue for other states as well. The benefit of including SCHIP will depend on the number of children participating in that program.

State CN agencies have a critical role in making DV-M possible. At a minimum, the state CN agency must establish a data-sharing agreement and make arrangements for the state Medicaid agency to provide data to school districts. In the pilot states, DV-M was more widely used and effective where the state CN agency did the following:

  • assured that the Medicaid data were complete and included sufficient information to identify children and verify income,
  • created an easy-to-use system with a familiar interface,
  • actively promoted the system and provided clear instructions on how to use it, and
  • made the data available before October 1, when verification notices usually are sent.

States can strengthen direct verification by creating an integrated system for DV-M and DV-S, and by providing the capability for both queries for individual NSLP applicants and batch matching of verification sample data with Medicaid/SCHIP data. In the pilot, small districts found it easiest to look up individual NSLP applicants in a database of Medicaid/SCHIP children. Large districts often found individual look-ups time-consuming and wanted a file matching process.

State agencies have a range of options for how to implement DV-M, as illustrated by the different systems used by the pilot states. In SY 2007–08, three of the five states (Georgia, Indiana, and Washington) provided Internet-based systems for DV-M with the capability for queries to verify individual students. In addition, Indiana enabled districts to upload verification sample data for matching, while Washington enabled districts to download Medicaid/SCHIP data. Tennessee distributed Medicaid data via the web for district-level look-ups and matching, while South Carolina matched districts’ verification sample data with Medicaid data and distributed the results on disk. Each state built on methods and systems developed for NSLP direct certification.

Interviews with state officials suggest that the level of effort to implement DV-M at the state level was modest, but lack of available technical staff delayed DV-M in several states. The Food and Nutrition Service (FNS) has provided grants to states for enhancing direct certification and direct verification (FNS, 2008). Two states in the study used these grants, and other states may be able to obtain similar grants to address resource constraints.

At the school district level, the first step toward using DV-M is getting information on state plans and encouraging the state to implement it. Where DV-M is available, local CN officials can learn how to use it through state training and by talking to other districts that have used it. Staff members responsible for verification need to be trained and authorized to use DV-M. Once these preparations are in place, school districts can maximize the time available to complete household verification if they draw their verification samples and conduct DV-M before October 1. Local CN officials may need to work with local information technology (IT) staff to make the best use of available data systems. IT staff may help create a verification sample file for matching, access student records and the database of FRP students, match Medicaid/SCHIP data to district student records, and update the FRP student database with direct verification results. Last but not least, local CN officials may find that the DV-M process increases effort for verification in the first year but will save time in later years.
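The sketch below (Python) illustrates the last step of that workflow: applying direct verification results so that household letters go only to the applications that remain unverified. The structures and field names are hypothetical placeholders, not a description of any district’s actual system.

  # dv_results: application_id -> 'verified' for applications confirmed by DV-M
  # (for example, the output of the matching and decision steps sketched earlier).
  def applications_still_needing_letters(verification_sample, dv_results):
      """Return the sampled applications that were not directly verified."""
      remaining = []
      for app in verification_sample:
          if dv_results.get(app["application_id"]) != "verified":
              remaining.append(app)   # household must still document eligibility
      return remaining

  sample = [
      {"application_id": "A-001", "category": "free"},
      {"application_id": "A-002", "category": "reduced"},
  ]
  results = {"A-001": "verified"}
  for app in applications_still_needing_letters(sample, results):
      print("Send verification letter for application", app["application_id"])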

DV-M implementation establishes a data-sharing relationship between education and Medicaid/SCHIP agencies. This relationship may provide other benefits to the agencies and low-income families. If authorized, Medicaid eligibility information could be used by school districts to identify administrative and service expenses that Medicaid will reimburse. Also, data from school meals applications may be used for “express lane” enrollment of students in Medicaid or SCHIP, as authorized by the SCHIP Reauthorization Act of 2009 (Dorn, 2009).

Two lines of future research on DV-M are necessary. First, more information is needed on implementation barriers. Other than the states in the pilot study, only North Carolina has implemented DV-M. Seven others have plans to do so: Arizona, California, Massachusetts, Nebraska, Pennsylvania, Texas, and Wisconsin. With a better understanding of why other states have not adopted this win-win approach, FNS can develop solutions, such as additional funding or technical assistance. Second, more research is needed to independently examine the accuracy of DV-M. All DV-M systems have the potential for error, either in computer matches or in the use of computer data by local employees. Knowledge of the extent of false positives (verification in error) and false negatives (failure to verify when the data support verification) will help validate matching results and enhance the process of direct verification.

Acknowledgments

This article is based on research funded by the Food and Nutrition Service, U.S. Department of Agriculture, under Contract number AG-3198-D-06-0060 with Abt Associates, Inc. A complete description of the research and the results is available in Logan, Cole, and Hoaglin (2009). The authors gratefully acknowledge the assistance of several reviewers at FNS, the helpful comments of the journal editor and reviewers, and the cooperation of the state Child Nutrition Agencies and school districts that participated in the study.

References

Dorn, S. (2009). Express lane eligibility and beyond: How automated enrollment can help eligible children receive Medicaid and CHIP. A catalog of state policy options. Retrieved June 4, 2009, from http://www.rwjf.org/files/research/autoenrollmentfinalapril2009.pdf

Food and Nutrition Service. (2007). Erroneous payments in the National School Lunch Program and School Breakfast Program: Summary of findings. Retrieved November 18, 2008, from http://www.fns.usda.gov/oane/MENU/Published/CNP/FILES/APECSummaryofFind.pdf

Food and Nutrition Service. (2008). Memorandum to State Child Nutrition directors: FY2008 direct certification and direct verification grants. Retrieved November 18, 2008, from http://www.fns.usda.gov/cnd/Grants/FY08certgrant_cover.pdf

Logan, C., Cole, N., & Hoaglin, D. (2009). Direct Verification Pilot Study final report. Retrieved October 19, 2009, from http://www.fns.usda.gov/ora/MENU/Published/CNP/FILES/DirectVerificationYear2.pdf

Ranalli, D., Harper, E., & Hirschman, J. (2009). Analysis of verification summary data, School Year 2007-08. Retrieved April 5, 2010, from http://www.fns.usda.gov/ora/MENU/Published/CNP/FILES/CNVerification2007-08.pdf

Biography

Logan is a Senior Associate at Abt Associates, Inc. in Cambridge, MA. Cole is a Senior Researcher at Mathematica Policy Research, Inc. in Cambridge, MA. Kamara is a Senior Researcher at the U.S. Department of Agriculture, Food and Nutrition Service, Office of Research and Analysis, in Alexandria, VA.
