jeromedelisle consulting


delislejerome@gmail.com


Evaluation Services

What we offer

We offer a wide range of educational evaluation and programme evaluation services. 

We currently specialize in educational evaluation (general and higher education), but we are keen to engage other sectors, including health.

Why you should choose us for programme evaluation

We offer a range of evaluation models and methodologically rigorous designs to provide credible evidence for informing policy and setting the future direction of your programme.

Our evaluations are strongly informed by theory, international evaluation standards, and innovative approaches. For example, we now include data visualization strategies for both our quantitative and qualitative findings.
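As a minimal sketch of what we mean by visualizing quantitative findings, the Python fragment below charts baseline versus endline scores by school cluster. The figures, column names, and output file are illustrative assumptions only, not results from any actual evaluation.

import pandas as pd
import matplotlib.pyplot as plt

# Illustrative (hypothetical) summary of quantitative findings:
# mean achievement scores by school cluster, before and after an intervention.
findings = pd.DataFrame({
    "cluster": ["A", "B", "C", "D"],
    "baseline": [48.2, 52.1, 45.7, 50.3],
    "endline": [55.4, 58.9, 47.1, 57.8],
})

fig, ax = plt.subplots(figsize=(6, 4))
findings.plot(x="cluster", y=["baseline", "endline"], kind="bar", ax=ax)
ax.set_ylabel("Mean achievement score")
ax.set_title("Illustrative baseline vs. endline scores by school cluster")
plt.tight_layout()
plt.savefig("findings_by_cluster.png")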

We are especially versed in Theory-Driven, Case Study, Responsive, and Developmental Evaluation Models.

We can employ a variety of mixed and multiple methods evaluation designs, multiple case studies, quasi-experimental designs, and even hierarchical linear modelling.
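As a hedged sketch of how hierarchical linear modelling might be applied to nested evaluation data, the fragment below fits a two-level random-intercept model with the statsmodels library. The file name and variable names (score, ses, treatment, school_id) are hypothetical placeholders for whatever outcome and predictors a particular evaluation defines.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student, with students nested within schools.
# Assumed columns: score (outcome), ses, treatment, school_id (grouping factor).
df = pd.read_csv("student_outcomes.csv")

# Two-level random-intercept model: fixed effects for SES and treatment,
# plus a random intercept per school to capture between-school variation.
model = smf.mixedlm("score ~ ses + treatment", data=df, groups=df["school_id"])
result = model.fit()
print(result.summary())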

Major Evaluation Contracts

2015

Technical Project Lead (Proposal Writer), Curriculum Review and Implementation Plan: an IDB-funded evaluation of the new curriculum for primary schools in Trinidad and Tobago. Awarded to the SOE, UWI, St. Augustine.

2012

Evaluation of the Single Sex Conversion Programme. Agency: Division of Educational Research & Evaluation, Trinidad and Tobago Ministry of Education.

2010

Evaluation of the Continuous Assessment Programme (CAP) in the primary school: design and implementation of a Theory-Driven Mixed Methods Evaluation (80 man-hours, January to July 2010). Agency: Seamless Education Project Unit, Trinidad and Tobago Ministry of Education.

2009

An evaluation of the Secondary Education Modernization Programme (SEMP) in Trinidad and Tobago and the development of a plan for handing over the SEMP projects to the Ministry of Education. Agency: Secondary Education Modernization Programme Coordinating Unit.

Major Publications and Presentations

Peer-Reviewed

2013

De Lisle, J. (2013). Exploring the value of integrated findings in a multiphase mixed methods evaluation of the continuous assessment programme in the Republic of Trinidad and Tobago. International Journal of Multiple Research Approaches, 7(1), 2-24. 

2012

De Lisle, J. (2012). Explaining whole system reform in small states using contextualized theory: The case of the Trinidad and Tobago Secondary Education Modernization Programme. Current Issues in Comparative Education, 15(1), 63-81.

Conference Papers (Peer-Reviewed and Non-Peer-Reviewed)

2015

De Lisle, J. (2015, April). Using canonical correlation analysis to study the complexity of continuous assessment practice in Trinidad and Tobago: Policy implications. Paper presented April 14-18 at the 2015 American Educational Research Association (AERA) Conference (theme: “Conceptualizing Justice: The Peoples of the Diaspora Speak Out on Inequities in the Research of and on Their Cultures, Languages, and Heritage”), Chicago, Illinois.


2012

De Lisle, J. (2012, April). Evaluating the practice of continuous assessment in Trinidad and Tobago: The promise versus the reality. Paper presented April 13-17 at the 2012 American Educational Research Association (AERA) Conference, Vancouver, British Columbia, Canada.



Recent Non-Peer-Reviewed

2015

De Lisle, J. (2015, November). Using developmental evaluation to improve a data-centric intervention for reducing the gender gap in the schools of Tobago. Paper presented in the symposium on K-12 educational evaluation, “Gathering, Using, and Collaborating through Data in Education,” at the 2015 American Evaluation Association (AEA) Conference, “Exemplary Evaluations in a Multicultural World,” November 9-14, Hyatt Regency Hotel, Chicago, Illinois.

De Lisle, J. (2015, November). Evaluation influence on policy and the role of the evaluator in a small island developing state: Confronting the challenge of scale. Paper presented in the symposium on evaluation use, “What Role Can Evaluation Play in Influencing Policy? Examples from Four Countries,” at the 2015 American Evaluation Association (AEA) Conference, “Exemplary Evaluations in a Multicultural World,” November 9-14, Hyatt Regency Hotel, Chicago, Illinois.

2014

De Lisle, J. (2014, October). Evaluation recommendations, politics, and program closure: The case of the single sex secondary school conversion program in Trinidad and Tobago. Paper presented in the symposium on evaluation use, “Using Mixed Methods to Make Hard Decisions: A Model and Two Studies,” at the 2014 American Evaluation Association (AEA) Conference, “Visionary Evaluation for a Sustainable Future,” October 15-18, Colorado Convention Centre, Denver, Colorado.

Gayah-Batchasingh, A., & De Lisle, J. (2014, October). An evaluation of preschool quality in a rural disadvantaged community in Trinidad: Investigating children's readiness for primary school. Roundtable presentation delivered at the 2014 American Evaluation Association (AEA) Conference, “Visionary Evaluation for a Sustainable Future,” October 15-18, Colorado Convention Centre, Denver, Colorado.

Lucas, T., & De Lisle, J. (2014, October). Sustainability in shifting circumstances: Building evaluative capacity as a line of defence in the Trinidad and Tobago public sector. Roundtable presentation delivered at the 2014 American Evaluation Association (AEA) Conference, “Visionary Evaluation for a Sustainable Future,” October 15-18, Colorado Convention Centre, Denver, Colorado.

Ramnanan-Mungroo, J. A., & De Lisle, J. (2014, October). Using concept mapping to evaluate young test-takers' perspectives of a multi-use high-stakes placement examination. Roundtable presentation delivered at the 2014 American Evaluation Association (AEA) Conference, “Visionary Evaluation for a Sustainable Future,” October 15-18, Colorado Convention Centre, Denver, Colorado.

Sample Abstract

Evaluation recommendations, politics, & program closure:
The case of the single sex secondary school conversion program in Trinidad & Tobago

This paper applies and extends Eddy and Berry's (2009) heuristic for program closure to an analysis of the events leading to the 2013 cessation of the single sex conversion program in the Republic of Trinidad and Tobago. This was a 2010 national intervention in 20 large secondary schools designed to reduce the persistent overall gender gap in educational achievement. A mixed methods evaluation with a quasi-experimental core component and a qualitative supplemental component was conducted in 2012. Although the overall results from the quasi-experimental component were neutral, findings from the qualitative component were unanticipated and mostly negative. The analysis is framed from the perspective of (1) the role of the evaluator in managing (2) the social and political norms of a small nation state. Evaluation utilization becomes a challenge in this dynamic and multiplex context because it fosters institutional vulnerability and political sensitivity. Sustainable outcomes demand context-sensitive evaluation practice.



Copyright 2009 jeromedelisle.com. All rights reserved.
