The Education Research Center (ERC) is composed of an interdisciplinary group of researchers with expertise in academic achievement, special education, teacher education, and teacher performance.

Optimal’s ERC team uses a full range of innovative and effective research methods, including:

  • Randomized field trials
  • Classroom observations
  • Structured and semi-structured interviews
  • Focus groups
  • Surveys

The staff also use state-of-the-art technology and analytic tools, such as the SAS and Stata statistical software packages, to analyze both small and large datasets on secure servers that comply with U.S. Department of Education security standards.

Optimal’s ERC team has subject matter expertise in these areas:

  • Academic achievement
  • Charter school financing
  • Charter school performance
  • Early childhood development
  • Performance measurement design
  • Policy (e.g., No Child Left Behind)
  • Summer learning
  • After school programs
  • Special education
  • Title I and IV compliance

Sample Past Performances:

National Center on Service Obligations (NCSO)

For this contract, Optimal monitored grant recipients and reported findings relating to GPRA measures that speak to the success of the grant program. The NCSO team developed reports and protocols to address GPRA and PART requirements. The project team also developed the project website, which provides an overview of the grant program and its regulations, and updates the site annually, similar to the requirements of Task 8 for the IFLE contract.

Optimal designed and implemented a real-time, secure, ED-accredited web-based data collection tool, the Service Obligation Tracking System (SOTS), which automates the collection, analysis, and reporting of Personnel Development Program data. As a result, NCSO staff can focus on analyzing and reviewing data to supplement reporting to ED. To manage the volume of scholars and to better facilitate communication among grantees, scholars, and employers, Optimal designed and maintains the SOTS, including its automated e-mail notifications; tiered access with unique user environments, varied permissions, and multiple end-user types; automated reports; and support for GPRA program assessment measure reporting.

Education Statistics Support Institute Network (ESSIN): Assessment Division Support

Optimal works closely with other ESSIN partner organizations to review NCES products, develop complex methodological frameworks for analyzing educational trends, create robust visualizations to illustrate findings, and provide policy solutions to relevant stakeholders. Optimal plays a key role in the reporting and dissemination of the Nation’s Report Card and conducts policy-relevant analysis of what the nation’s students know and can do.

Tasks in this project include collecting and standardizing data in different formats and conducting analyses based on the standardized data. For ESSIN, the Analytics team customized the Real-time Framework with Excel VBA code that takes large volumes of NAEP Data Explorer (NDE) output (in Excel) and transforms it into tables and line graphs automatically, rather than requiring the ESSIN team to build hundreds of tables and line graphs by hand.
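The core idea behind that automation can be sketched briefly. The actual ESSIN tool was built in Excel VBA; the snippet below is only an illustrative Python sketch, with hypothetical column names and scores, showing the same reshaping step: turning a wide, one-column-per-year export into a tidy long table that charting code can iterate over instead of hand-building each table and graph.

```python
# Illustrative sketch only; the production ESSIN tool used Excel VBA.
# Hypothetical NDE-style export: one row per jurisdiction, one column per year.
import pandas as pd

nde_export = pd.DataFrame({
    "Jurisdiction": ["National", "California"],   # example rows, not real data
    "2015": [282, 275],
    "2017": [283, 277],
})

# Melt the year columns into (Year, AverageScore) rows -- the "long" shape
# that a loop can feed into table and line-graph builders automatically.
tidy = nde_export.melt(id_vars="Jurisdiction",
                       var_name="Year",
                       value_name="AverageScore")
tidy["Year"] = tidy["Year"].astype(int)

print(tidy)
```

Once the data are in this long shape, a single loop over jurisdictions can emit every table and line graph, which is the labor savings the customization delivered.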

Evaluation of the Graduate Nurse Education Demonstration (GNE), Phase I

The Graduate Nurse Education (GNE) demonstration, authorized by the Affordable Care Act, aims to increase the supply of advanced practice registered nurses (APRNs) in the U.S. health care delivery system by providing Medicare payments to five selected hospitals for the reasonable cost of providing clinical training to APRN students. The demonstration also fosters partnerships between hospitals, schools of nursing (SONs), and community-based care settings (CCSs).
Optimal and its subcontractor, AIR, designed and implemented a program evaluation to inform the demonstration’s Report to Congress (RTC). The evaluation design includes a mixed-methods structure and process evaluation of how well the sites implemented the demonstration (including challenges, successes, and innovations), conducted through qualitative interviews and focus groups.

The design also includes an outcome evaluation based on short-term outcomes from the first two years of the demonstration. The data collection and analysis address the following main research questions, in addition to several related questions:
1. How much did the number of APRNs grow as a result of the demonstration, relative to the base years?
2. How much did each of the following specialties grow: clinical nurse specialist, nurse practitioner, certified nurse anesthetist, and certified nurse midwife?
3. What costs did the demonstration generate for the Medicare program?
The evaluation includes collecting and analyzing primary and secondary data to examine issues pertaining to cost and participation in this demonstration, including the changes made within the five eligible hospitals and multiple partners. Primary data will be collected through focus groups, site visits, and a series of demonstration-site reports. Secondary data will come from the CMS hospital billing system, eligible hospital cost reports, and partners of the eligible hospitals.
Early evaluation findings have revealed the challenges and nuances of the demonstration, such as a lack of baseline data (or difficulty accessing it) at demonstration SONs, the lag in receiving actual cost information from CMS audit contractors, and variations in the application of CMS’s payment methodology and other network model attributes. The Optimal team’s approach to Phase II addresses these findings.
Take a look at some of our previous work.

For more information, e-mail us at