Transfer, Roll Out and Evaluation

Each partner was free to take a different approach to the roll out, reflecting the social and economic context of their own country. All partners provided information on their planned approach/scheme of work, the number of participants, the dates of activity, and how they used their mentors.

For this part of the NESSIE Project, evaluation was defined as the “… systematic application of social research procedures for assessing the conceptualisation, implementation and utility of social intervention programmes.” The focus was on evaluating the impacts of the course, not the learning material itself. We had to step back, see the bigger picture, and consider the planned impacts on society (entrepreneurs, employers, etc.).

Impacts were defined as both short term and long term, for the employment sector and for job seekers. Long term impacts (such as increased success of participants in job search, improved opportunities for advancement at work, greater job satisfaction, and improved performance of SMEs/public sector services) are difficult to measure and cannot be recorded during the life of the Project. This evaluation therefore seeks to measure short term impacts on trainers/mentors, employees in the labour market and job seekers.

In order to finalise the questions to be used, the consortium discussed descriptors (actions we can observe to confirm that participants have acquired the learning) and indicators (the graded answers/reactions from the participants), as well as the balance between qualitative and quantitative measures. To ensure fulfilment of the stated Project objectives, it was agreed that the questions should be taken from the indicators in the original Project application.

There was some discussion about when to ask users these evaluative questions: if we left it too long, we risked losing contact with some of the participants. The reflective account is the final piece of activity that learners complete six weeks after the learning, and it was agreed that the questionnaire should form part of that. However, it is clear from some of the comments that some users completed the evaluation immediately after working through the course, without the period of reflection.

Initial data was collected from participants as they registered on the e-learning site. The evaluation questionnaire was embedded within the Planning and Organising part of the course. It should be noted that the course was originally designed to be customised by each individual learner, who would undertake only the parts of the course relevant to them, depending on their own soft skills strengths and weaknesses. For the purposes of this evaluation, however, some users were asked to complete all sections of the course regardless of personal relevance.
