
Smarter Care, Not Smarter Surveillance: My SJJTU Talk on AI, Ethics, and Student Wellbeing

Today, I walked into the national conference at SJJT University carrying a familiar discomfort. Not stage fear. Something sharper. The discomfort of knowing our education system still worships marks, while students live inside messy, ungraded realities.

I opened my intervention with a question I have asked in classrooms, boardrooms, and coaching rooms: how many of us have taught, studied with, or known a student who looked perfectly fine on paper, yet was quietly struggling beneath the surface? That question is not rhetorical. It is diagnostic. It exposes the one gap we have normalised: we often do not notice until something breaks.

Marks measure performance, not the human being

Modern education is astonishingly good at measurement. We can rank, compare, standardise, and even convert curiosity into grades. Yet we remain clumsy with what truly drives learning: emotional safety, belonging, motivation, shame, hope, stress, and the quiet exhaustion that shows up as “lack of effort”.

In my talk, I held one idea firmly: academic performance is not the same as emotional wellbeing. A child can score high and still be drowning. A student can submit every assignment and still be burning out. The tragedy is that we call it “sudden” only because we were not trained, or equipped, to see the earlier signals.

Society has an odd romance with late intervention. We like dramatic rescues because they flatter the rescuer. Preventive care is boring, quiet, and unphotogenic. That is precisely why we need it.

What “AI for emotional and behavioural analytics in students” really means

Let’s clear a common fear. When I speak of AI for emotional and behavioural analytics in students, I am not selling a dystopia where children are monitored and labelled by machines. I am talking about detecting patterns over time that may indicate rising stress, disengagement, burnout, or social-emotional struggle, especially the kind grades fail to capture.

A grade is a snapshot. A student’s inner world is a film.

Emotional and behavioural analytics tries to read the film. It asks: what has changed across days and weeks? Is participation tapering? Is the student suddenly absent, not as rebellion, but as retreat? Has their written work shifted from coherent to clipped, from curious to flat? These are not sins. They are signals.

The purpose is not diagnosis. The purpose is visibility. Earlier visibility creates earlier, kinder intervention.

The data question: care, consent, and the ethics of “optional”

This is where people either lean in or lean away, and both reactions are healthy. In my intervention, I described two broad streams of data.

First, the data schools already have: attendance, assignment submissions, participation, engagement on online learning platforms, and the rhythm of interaction that good teachers sense but cannot always track systematically.

Second, optional affective signals, and I stressed the word optional: facial expressions, voice patterns, and signals from wearables. These can offer early indicators of stress when used with informed consent, tight privacy practices, and clear limits. Without that ethical spine, the very same tools become surveillance disguised as innovation.
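
For readers who want to see how that separation can be enforced rather than merely promised, here is a minimal sketch in Python. Every name in it is hypothetical, invented for illustration; the structural point is that optional signals should be impossible to use unless consent is explicitly recorded.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EngagementRecord:
    """Signals most schools already hold, aggregated per student per week."""
    student_id: str
    week: int
    attendance_rate: float        # 0.0 to 1.0
    on_time_submissions: int
    forum_posts: int
    avg_response_words: int       # length of written responses

@dataclass
class AffectiveRecord:
    """Optional affective signals. Meaningful only with explicit, revocable consent."""
    student_id: str
    consent_given: bool = False
    stress_index: Optional[float] = None   # e.g. from a wearable, if consented
    voice_strain: Optional[float] = None

def usable_affective(record: AffectiveRecord) -> Optional[AffectiveRecord]:
    # Without consent, affective data never enters the analysis at all.
    return record if record.consent_given else None
```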

Technology is rarely neutral. It amplifies the values of the people designing and deploying it. If our values are control, we will build control. If our values are care, we can build care.

How AI reads change: patterns, not moments

AI is most useful when it stops trying to be a mind-reader and becomes a pattern-spotter.

It can analyse multiple indicators over time rather than single events. It can detect shifts in engagement, participation, and communication, then flag early warning signals, not as verdicts, but as prompts to look closer.

I offered a simple metaphor: AI is a smoke alarm, not a fire brigade. It does not put out the fire. It does not accuse anyone of starting it. It simply says: something is changing, pay attention.

Then I shared an example many educators recognise. A student begins submitting work late after months of consistency. Separately, their online responses get shorter. They withdraw from discussion boards. None of these is “big enough” to create alarm in a busy system. Together, they form a meaningful shift. AI for emotional and behavioural analytics in students can connect these dots faster than a human, especially when one teacher is holding 30-40 emotional worlds at once.
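
For those who think in code, a minimal sketch of that dot-connecting might look like the following. The signal names, numbers, and thresholds are all invented for illustration; the idea is simply to compare each indicator against the student's own recent baseline and to suggest a check-in only when several indicators shift together.

```python
import statistics

def has_drifted(history: list[float], recent: float, threshold: float = 2.0) -> bool:
    """True when a recent value sits more than `threshold` standard
    deviations away from the student's own earlier baseline."""
    if len(history) < 4:
        return False  # not enough baseline to judge a shift
    mean = statistics.mean(history)
    spread = statistics.stdev(history)
    if spread == 0:
        return recent != mean
    return abs(recent - mean) / spread > threshold

def suggest_check_in(weekly: dict[str, list[float]]) -> bool:
    """Suggest a human check-in only when two or more indicators drift
    at once. Values are week-by-week measurements, most recent last."""
    drifting = [name for name, series in weekly.items()
                if has_drifted(series[:-1], series[-1])]
    # One noisy signal is ignored; a joint shift is a prompt, not a verdict.
    return len(drifting) >= 2

# The student from the example above: later submissions, shorter replies,
# a fading presence on the discussion boards.
signals = {
    "days_late":       [0, 0, 1, 0, 0, 0, 5],
    "response_words":  [120, 110, 130, 115, 125, 118, 40],
    "forum_posts":     [4, 5, 3, 4, 4, 5, 0],
}
print(suggest_check_in(signals))  # True: three quiet signals form one loud one
```

Notice what the sketch refuses to do: it never labels the student, never acts on its own, and a single odd week changes nothing.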

Behavioural analytics in action: protecting teacher judgement

A question I received afterwards was honest and important: Will AI tell teachers what to do? My answer was clear. It must not.

Behavioural analytics should provide actionable insights, summarise patterns across time, and help prioritise attention, while preserving teacher judgement. The educator remains the decision-maker. The psychologist remains the interpreter. The human remains the one who meets the student.
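
One way to honour that boundary in software is to make the system's final output a review queue rather than a decision. Here is a sketch of that design choice, again with hypothetical names:

```python
from dataclasses import dataclass

@dataclass
class ReviewPrompt:
    """A prompt for a human to look closer. It deliberately carries no
    diagnosis, recommendation, or student-facing label."""
    student_id: str
    drifting_signals: list[str]   # e.g. ["days_late", "forum_posts"]

def prioritise_for_review(flags: dict[str, list[str]]) -> list[ReviewPrompt]:
    """Order students by how many independent indicators have shifted,
    so a busy teacher knows where to look first. The system stops here;
    interpretation and action remain with the educator."""
    prompts = [ReviewPrompt(sid, sigs) for sid, sigs in flags.items() if sigs]
    return sorted(prompts, key=lambda p: len(p.drifting_signals), reverse=True)
```

The whole design lives in what the function returns: prompts for a person, with the underlying signals visible, and nothing downstream that acts on them automatically.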

This matters because context is everything. A student flagged for disengagement might be caring for a sick parent. Another might be facing bullying. Another might be neurodivergent and overwhelmed. AI can highlight patterns. It cannot make meaning of them.

The institutional shift: from reactive schools to preventive cultures

At an institutional level, the promise becomes bigger and more sensitive. AI tools can support early identification of students at risk, help build personalised support plans, and offer dashboards that guide decisions without turning a child into a number.

But here is the societal question I placed on the table: are we willing to design education systems that care about wellbeing as much as results? Or do we want wellbeing only when it improves results?

If we treat wellbeing as merely a productivity tool, we will eventually make wellbeing itself toxic. Students will learn that even mental health is a performance metric. So yes, data-driven preventive interventions can be powerful. But we must protect the student from being reduced to a risk score.

Ethics: the line between support and surveillance

Ethics is not a side-note. In education, ethics is the difference between trust and trauma.

Privacy and informed consent are non-negotiable. Algorithmic bias must be actively prevented so we do not reinforce inequity. Human oversight must remain central because a model cannot understand culture, grief, poverty, or the complicated politics of adolescence.

I offered a simple litmus test: if a student would feel unsafe knowing the system is operating, the system is not ready. If an institution cannot explain the tool in plain language, the tool is not ready. If data informs decisions but begins to dictate them, we have crossed the line.

My closing reflection: smarter care, not just smarter machines

I ended with a thought that felt both technological and deeply human.

AI can illuminate what students cannot always express in words. It can highlight stress, disengagement, or emotional struggle that might otherwise go unnoticed. But technology alone cannot replace the heart of education. It is human empathy, understanding, and connection that turns insight into meaningful support.

If AI for emotional and behavioural analytics in students helps us see students more clearly, the real work begins after the insight. It begins in the teacher’s tone, the counsellor’s patience, the institution’s policies, and the courage to build preventive cultures that do not wait for children to break before we decide they matter.

Dr Krishna Athal | Life & Executive Coach | Corporate Trainer | Leadership Consultant
Dr Krishna Athal is an internationally acclaimed Life & Executive Coach, Corporate Trainer, and Leadership Consultant with a proven track record across India, Mauritius, and Singapore. Widely regarded as a leading voice in the field, he empowers individuals and organisations to unlock potential and achieve lasting results.
