When CBSE announced its On-Screen Marking (OSM) paper-checking process for Class 12 in February 2026, the controller of examinations, in his first round of interactions with educators across some 31,000 schools, noted that this year examiners would experience a “no touch, no dust — clean evaluation.”
As lakhs of students await the CBSE Class 12 results 2026, feedback emerging from evaluators and those familiar with the process presents a mixed but revealing picture of the Board’s first large-scale shift to a digital paper correction system.
“There was a lot of reluctance in the beginning, as is the case with most changes,” said Sapna Charha, headmistress of Modern Public School, Shalimar Bagh, referring to the initial response among evaluators adapting to screen-based assessment. However, educators across schools also suggest that most teachers gradually became comfortable with the system after orientation and training sessions conducted before and during evaluation.
The new ‘digital evaluation’ system now sits at the centre of expectations around the declaration of CBSE 12th results 2026. While the Board has not officially confirmed a date and time, the structural changes introduced this year have raised the possibility that CBSE results could be released sooner than usual.
At the core of the expectation that Class 12 results could be announced sooner this year are the timelines outlined by the Board itself. CBSE Controller of Examinations Sanyam Bhardwaj, while detailing the rollout of OSM a few months earlier, shortly before the CBSE Class 12 exams began, had indicated that the evaluation process was being compressed significantly. The first day, he noted, would be used to discuss the marking scheme and conduct a mock evaluation, after which “the evaluation will then proceed for about 8 days, with marks uploaded every evening.”
More importantly, he emphasised that “the evaluation for Class 12 is expected to be completed in much fewer days than the earlier 60 days,” with regional offices being asked to wrap up the process in about 9 days instead of 12 — saving nearly three days in each cycle. The broader goal, he noted, is “to declare results on time, and possibly much before the time in the future.”
What is different this year
The debut of CBSE’s ‘OSM Onmark portal’ marks a major shift from the earlier system, in which physical answer sheets moved between centres through multiple layers of checking and cross-checking over extended timelines. Under OSM, answer scripts are instead distributed digitally to evaluators within each designated centre. “When an individual examiner completes one evaluation, they will fetch another answer book from any school assigned to that zone,” Bhardwaj had explained at the time.
The OSM system ensures that no question remains unevaluated — eliminating totalling and counting errors while maintaining detailed logs from script-level evaluation time to review history and mark distribution.
Charha, of the Modern Public School, Shalimar Bagh, suggests that while the system is robust in design, its impact on timelines may not yet be dramatically visible. “I am not able to see any speed in the actual evaluation or checking system if I compare OSM versus the traditional,” she said, adding that subject-wise evaluation continues in phases – physics, mathematics, and so on, much like previous years.
Some evaluators in other states also pointed to technical slowdowns during the initial phase of implementation. According to an Indian Express report, teachers involved in the evaluation process described instances of blurred scanned answer sheets, slow-loading scripts, and intermittent server issues, suggesting that parts of the expected efficiency gains may have been offset during the first large-scale rollout of the system.
Priyadarshini Mane, principal of VIBGYOR High Balewadi in Pune, Maharashtra, said teachers from her school who participated in the evaluation of subjects such as mathematics and physics finished nearly two days sooner than under the earlier manual process.
According to her, automated tabulation and the elimination of physical handling have reduced portions of manual workload, even if the broader impact of the system may become clearer only over the next few cycles.
Evaluators, according to the report, also noted that revised instructions issued during the correction cycle occasionally required already-evaluated answer scripts to be reopened and checked again. Some teachers said this added to the time taken during the early stages of digital evaluation, particularly while adapting to the portal-based workflow.
The daily evaluation load, too, remains comparable. “The maximum number of sheets was 25 that teachers were supposed to check in a day,” Charha said, with working hours continuing from 8 am to 4 pm. The key difference lies in how work is distributed and tracked; each evaluator logs into the portal individually and is assigned a fixed set of answer sheets, with no scope for redistribution among peers.
She explained that in the earlier process, quicker evaluators could pick up additional answer sheets from colleagues who took longer to complete corrections. Mane also pointed out that while the Board appeared to have selected centres keeping infrastructure readiness in mind, internet instability could remain a practical challenge during evaluation.
Infrastructure disparities also remain a concern. While some schools are well-equipped with computer labs and stable internet, others may not be. “In my school, infrastructure was not a problem, but I am sure there must be many schools which could not support such strong infrastructure,” Charha said, calling internet reliability “an unpredictable thing,” even as she acknowledged that “the portal itself is beautifully designed, and is easy for teachers to understand, familiarise and pick up on.”
Where OSM clearly delivers, however, is in accuracy and monitoring. The system does not allow submission unless every question is evaluated or marked as ‘NA’, and can even detect when answers are written out of sequence, Charha elaborated. Totalling errors are eliminated, and record-keeping is automated — addressing long-standing issues in the traditional system.
Still, the transition has not been without friction. “I think the Board could have given hands-on training somewhere in the middle of the session, rather than near the exam commencement,” Charha remarked, pointing to initial hesitation among teachers.
Mane acknowledged that the first year of implementation is likely to involve a learning curve before the system’s full efficiency becomes visible. She said, “The exact efficiency gains of on-screen marking may only become clearer once teachers become more accustomed to digital correction workflows in the coming years.” However, she described the transition overall as “a step in the right direction.”
Charha also flagged increased screen time as a drawback. “When you are constantly sitting in front of the computer, your eyes get tired,” she said, highlighting a practical challenge that comes with digital evaluation.
Beyond screen fatigue, some schools also reported operational strain during the evaluation period, the report notes, with senior teachers being repeatedly assigned or recalled for board duties. Educators noted that prolonged evaluation schedules occasionally affected regular classroom routines for Classes 11 and 12 during the assessment cycle.
When asked about timelines, Charha remained cautiously optimistic about a possible result date. “It appears that it should give the result faster as the overall timelines of handing out answer scripts, allocation of evaluation centres, and the delay in transportation have been done away with,” she said.
Mane, too, felt the system would likely become more efficient and trustworthy over time, but on result timelines she said: “the overall result declaration window this year may remain broadly similar to previous years, given that this is the first large-scale rollout of OSM.”