Research and Evaluation

Optimal Solutions Group is dedicated to providing real-time data and solutions to decision-makers dealing with rapid change.

Instead of looking back on why a project succeeded or failed, Optimal’s data evaluation and analysis services are designed to provide continuous feedback that allows for course corrections while a program is still under way. We call this design, which allows for concurrent data collection, analysis, and reporting, our “Real-Time Framework.”

With a focus on multidisciplinary design, Optimal combines rigorous methods, proven technologies, and adherence to human-subjects, security, and accessibility guidelines to provide timely, evidence-based reporting for programs, partners, and public stakeholders.

Our in-house staff of social scientists has expertise spanning diverse disciplines, including workforce development, welfare reform, education policy, child welfare, and economic development policy.

Download Optimal’s Capability Statement

Policy and Program Evaluation

  • Rigorous impact evaluation using experimental, quasi-experimental, and non-experimental designs to generate evidence of what works and inform program and activity design
  • Performance evaluation, monitoring and evaluation, and continuous quality improvement analysis to provide assessments of performance within the program cycle
  • Rapid-cycle evaluation to provide real-time updates on program performance
  • Process evaluation to understand the scope and design of projects
  • Implementation analysis to assess whether the treatment has been delivered with fidelity to its original design
  • Environmental analysis, climate-risk screening, and sustainability analysis to assess the impact of climate change
  • Political economy and governance analysis to assess the state of the democratic process
  • Culturally competent, gender-inclusive, and sustainability-minded approach to evaluation design
Statistical and Econometric Analysis
  • Multiple linear regression modeling
    • Cross-sectional, time-series, and longitudinal/panel-data modeling
    • Hierarchical linear modeling
    • Regression discontinuity
    • Propensity score matching (a brief sketch follows this list)
    • Instrumental variable design
  • Experienced in using SAS, Stata, R, SPSS, and Python
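To illustrate, a minimal sketch of the propensity score matching design listed above, written in Python (one of the tools named); the data, variable names, and treatment effect are simulated purely for illustration and are not drawn from any Optimal project:

```python
# Nearest-neighbor propensity score matching on simulated data (sketch only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
x = rng.normal(size=(n, 2))                            # observed covariates
treat = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))    # treatment depends on x
y = 2.0 * treat + x @ np.array([1.0, 0.5]) + rng.normal(size=n)  # outcome

# 1. Estimate propensity scores with a logit model.
logit = sm.Logit(treat, sm.add_constant(x)).fit(disp=0)
pscore = logit.predict(sm.add_constant(x))

# 2. Match each treated unit to the control unit with the closest score.
treated_idx = np.where(treat == 1)[0]
control_idx = np.where(treat == 0)[0]
matches = control_idx[
    np.abs(pscore[treated_idx, None] - pscore[None, control_idx]).argmin(axis=1)
]

# 3. Average treatment effect on the treated (ATT) from matched pairs.
att = (y[treated_idx] - y[matches]).mean()
print(f"Estimated ATT: {att:.2f}  (true simulated effect = 2.0)")
```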
Qualitative Analysis
  • In-depth structured, semi-structured, and unstructured key informant interviews
  • Focus groups
  • Literature reviews and environmental scans
  • Systematic review of data and documentation to support desk reviews
  • Case studies
  • Experienced in using NVivo and Atlas.ti software to facilitate qualitative analysis
Machine Learning and Artificial Intelligence
  • Supervised and unsupervised machine-learning algorithm training for pattern recognition and predictive analytics
    • Natural language processing algorithm training
    • Image recognition algorithm training
    • Decision tree models for feature selection and outcome prediction (a brief sketch follows this list)
  • Experienced in using neural networks and tools such as Python, Torch, Caffe, TensorFlow, Keras, Google Cloud Platform, and Amazon Web Services
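As a brief illustration of the decision-tree item above, the sketch below trains a small classifier on synthetic data and ranks feature importances; it uses scikit-learn, an assumed library that is not named in the tools list, and all data are simulated:

```python
# Decision tree for outcome prediction and simple feature selection (sketch only).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for real program records.
X, y = make_classification(n_samples=500, n_features=8, n_informative=3,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier(max_depth=4, random_state=0)
model.fit(X_train, y_train)

print("Held-out accuracy:", round(model.score(X_test, y_test), 3))
# Feature importances support a first pass at feature selection.
for i, imp in enumerate(model.feature_importances_):
    print(f"feature_{i}: {imp:.3f}")
```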
Economic, Cost, and Investment Analysis
  • Ex ante, ex post, and in medias res cost-effectiveness and cost-benefit analysis
  • Economic modeling using deterministic models, complex Markov chains, and Monte Carlo simulation designs (a brief sketch follows this list)
  • Public finance management
  • Public sector investment analysis
  • Return on investment analysis
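To illustrate, a minimal sketch of a Monte Carlo cost-benefit simulation of the kind referenced above; the costs, benefits, discount rate, and distributions are hypothetical placeholders, not figures from any actual analysis:

```python
# Monte Carlo simulation of net present value under uncertainty (sketch only).
import numpy as np

rng = np.random.default_rng(42)
n_draws = 10_000
discount_rate = 0.03
years = 10

# Uncertain annual benefits and costs (illustrative distributions).
annual_benefit = rng.normal(loc=120_000, scale=20_000, size=(n_draws, years))
annual_cost = rng.normal(loc=90_000, scale=10_000, size=(n_draws, years))

# Discount each year's net benefit back to present value.
discount = 1.0 / (1.0 + discount_rate) ** np.arange(1, years + 1)
npv = ((annual_benefit - annual_cost) * discount).sum(axis=1)

print(f"Mean NPV: {npv.mean():,.0f}")
print(f"Probability NPV > 0: {(npv > 0).mean():.1%}")
```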
Large-scale Survey Administration
  • Develop complex, multi-stage survey methodologies with clustered and stratified random sampling designs (a brief sketch of the sampling step follows this list)
  • Develop, test, and pilot data collection instruments
  • Recruit, train, and lead large survey data collection teams
  • Use real-time survey tools within the Revelo™ suite that allow offline tablet- and mobile-based data collection
  • Use real-time survey tools to check skip patterns, track geographic locations, and monitor data entry for outliers, thus providing quality assurance and timely course corrections in the field
  • Report data-collection progress to stakeholders in real time
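For illustration, a minimal sketch of the stratified random sampling step within a multi-stage design; the sampling frame, strata, and sample size are hypothetical:

```python
# Proportional-allocation stratified random sampling (sketch only).
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Hypothetical sampling frame: households grouped into regional strata.
frame = pd.DataFrame({
    "household_id": range(10_000),
    "stratum": rng.choice(["north", "south", "east", "west"],
                          size=10_000, p=[0.4, 0.3, 0.2, 0.1]),
})

total_sample = 1_000
# Proportional allocation: each stratum's share of the sample matches
# its share of the frame.
allocation = (frame["stratum"].value_counts(normalize=True)
              * total_sample).round().astype(int)

sample = (
    frame.groupby("stratum", group_keys=False)
         .apply(lambda g: g.sample(n=allocation[g.name], random_state=7))
)
print(sample["stratum"].value_counts())
```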
Knowledge Creation, Management, and Dissemination
  • Develop standards for data quality, completeness, and documentation
  • Develop tool kits and guidance documentation for data collection, preparation, and submission
  • Develop standardized data-collection tools to be used by communities of research practice
  • Develop technical briefs that synthesize primary project- and activity-level data and research findings to inform strategy-level performance
  • Lead trainings, capacity-building workshops, and professional development activities that equip stakeholders to use data and evidence from the field to inform policy decisions
  • Develop centralized, web-based data and knowledge product repositories
  • Disseminate knowledge products through the web
Communications Services
  • Conference planning
  • Publication preparation
  • Web design and development
  • Database development and maintenance
  • Information dissemination
  • Video conferencing and webinars