Wrong Software Was Used To Grade JAMB 2016 Examinations


The Joint Admissions and Matriculation Board (JAMB) used the wrong grading software to mark the 2016/2017 Unified Tertiary Matriculation Examination (UTME), the Association of Tutorial School Operators (ATSO) has alleged.

ATSO claimed that the error was responsible for the poor and conflicting results that were released by the board for the examination written between February 27 and March 17 in over 500 centres nationwide.

Speaking in Lagos on Wednesday, March 23, the President of the group, Mr Shodunke Oludotun, said the board used last year's software to grade this year's examination.

Oludotun also called for the resignation of the JAMB Registrar, Prof 'Dibu Ojerinde.

Oludotun said: “We have our evidence to show that virtually all the candidates whose results we have collected received results for 2015/2016 and not for 2016/2017.

“This year, Prof Dibu Ojerinde advertised 2016/2017 UTME – we all saw it. During his press conference, he also mentioned 2016/2017. During the exam, the students' monitors displayed 2016/2017. Why is it that the result that was sent to the students showed 2015/2016?

“From our findings from insiders in JAMB, we realised that the software of 2015/2016 interfered with that of 2016/2017, which led to the massive failure of the students. If you look at the trend of results from (February) 27 to 29, the students failed; (March) 7-15, the students failed massively.

“But we noticed that the candidates of 27-29 were compensated with 40 marks, still under the interference of the software. We can see that the 2015/2016 software was used to mark, which was why the students were receiving 2015/2016 results. So where is the 2016/2017 result? That is what we are asking Prof Dibu Ojerinde…We are saying that Prof Dibu Ojerinde should step aside.”

He called for Ojerinde to step aside so that someone else could build on the foundation he has laid to conduct hitch-free examinations.

JAMB’s Director of Media and Public Relations, Dr Fabian Benjamin, has responded by saying the focus should be on the programming and not the marking guide.

“I am not a programmer, but I can confirm to you that JAMB does not joke with its template. What happened with the 40 marks issue is that the scripts were marked out of 250 marks, because only English Language carries 100 marks while the other three papers carry 50 marks each, making a total of 250.

“So when the first results were released, they were calculated based on 250, and after normalization we felt it would not be ideal for us to cheat the candidates. So we had to quickly send them their real scores,” Benjamin stated.
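Benjamin's arithmetic (English Language at 100 marks plus three papers at 50 marks each, for a raw total of 250) can be sketched as follows. The article does not state the normalization formula JAMB applied, so the linear rescaling to the standard 400-point UTME scale below is purely an illustrative assumption:

```python
# Per-paper maxima as described by Benjamin: English is 100, the
# other three papers are 50 each. Subject names other than English
# are hypothetical placeholders.
PAPER_MAXIMA = {"English": 100, "Subject2": 50, "Subject3": 50, "Subject4": 50}

RAW_TOTAL = sum(PAPER_MAXIMA.values())  # 250, the basis of the first release
TARGET_TOTAL = 400  # standard UTME scale; that this was the target is an assumption

def normalize(raw_score: float) -> float:
    """Linearly rescale a raw score out of 250 onto the 400-point scale.

    This is one plausible normalization, not JAMB's confirmed method.
    """
    return raw_score * TARGET_TOTAL / RAW_TOTAL

print(RAW_TOTAL)        # 250
print(normalize(125))   # 200.0 -- a candidate scoring half the raw marks
```

Under this sketch, a score published on the 250 basis would understate a candidate's mark relative to the 400 scale, which is consistent with Benjamin's account of re-sending "real scores" after normalization.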
