04/23/2026 | News release
The rise of artificial intelligence has often been framed as a challenge to the humanities and social sciences. But speakers at "Future Horizons: Envisioning the Humanities and Social Sciences", recently organised by the NUS Faculty of Arts and Social Sciences (FASS) as part of the Ideas Festival Singapore 2026, argued the opposite: that a world transformed by AI will need these disciplines even more urgently than before.
In his welcome remarks, Professor Lionel Wee, Dean of FASS, framed this debate within the broader context of rapid technological advances, intensifying climate crises and growing social fragmentation. In such a moment, he said the humanities and social sciences must continue to shape the future, "but it cannot be by maintaining the status quo."
Throughout the afternoon, speakers from academia, philanthropy and the arts returned to a shared conviction: as AI expands what machines can do, societies will need the humanities and social sciences more than ever to interpret fast-changing realities, interrogate ethics and power, and preserve human agency.
Navigating the AI deluge: Judgment, uncertainty and bias
In an AI-saturated environment, judgment, interpretation and the ability to recognise uncertainty become increasingly important. AI can generate and organise information at scale, but that information is not the same as understanding.
Historian Professor Adam Clulow from the University of Texas at Austin pointed to the scale of today's information environment. He said, "A conservative estimate is that AI generates 1 trillion words of content every day."
He noted that the key skills jobs will demand in this environment are those cultivated in humanities classrooms, where students are trained to evaluate information sources, assess bias and context, work through conflicting interpretations, and document uncertainty and ambiguity.
"We must walk this winding and difficult road ourselves if we wish to be the sort of intellectual agents capable of reacting appropriately to the modern world." - NUS Provost's Chair Professor of Philosophy Qu Hsueh Ming
That same critical lens is needed for fairness and representation. Social scientist and internet geographer Professor Mark Graham from the Oxford Internet Institute at the University of Oxford described today's world as "a synthetic society" shaped by humans, platforms, models and datasets. He cautioned that while these systems can look objective, close examination of their patterns at scale reveals systemic bias.
Prof Wee observed that the issue predates AI: "The value accorded to insights was never democratic…Some categories of humans were not considered humans." He pointed to how societies have long decided whose voices count and whose do not. As AI systems have been trained on historical records and online data shaped by those inequalities, they can end up reproducing the same exclusions.
Against this backdrop, the humanities and social sciences are indispensable, not because they can make AI perfectly objective, but because they train people to question what appears neutral and how AI encodes assumptions about whose voices count.
Underscoring this point, Prof Graham said, "What we can offer is not just correcting AI, not just making it more accurate…more objective, that's impossible, but rather a totally different epistemology…one that foregrounds plurality and positionality and power."
Writing and creating still matter as they shape how people think
Speakers also pushed back against the idea that writing and creativity are simply outputs that AI can replace. Instead, they described these processes as humanities and social sciences practices that shape how people reason and make meaning.
NUS Provost's Chair Professor of Philosophy Qu Hsueh Ming stressed that writing is not only communication but also cognition. He said, "Writing is not only a way for us to communicate ideas, but also a way for us to organise and systematise ideas."
He noted that because system-building is an inherently individual process, it cannot be outsourced to large language models. "We must walk this winding and difficult road ourselves if we wish to be the sort of intellectual agents capable of reacting appropriately to the modern world, and that's going to be the key to employability," he added.
Adding to Prof Qu's point, Singapore University of Social Sciences (SUSS) Provost and Professor of Literature Robbie Goh argued that amid an onslaught of AI-generated content, the humanities must hold on to what cannot be reduced to data alone: "What we need to retain desperately…is precisely that genius of individual interpretation."
In the cultural sphere, Dr Seng Yu Jin, Senior Curator and Director (Curatorial, Research & Exhibitions) at the National Gallery Singapore, argued that art matters even more amid AI-driven "image wars" and saturation because it cultivates sensibility and an attention to nuance and meaning. He highlighted "slowness" as a form of agency in a world optimised for speed and efficiency.
Trust, human behaviour and social context will decide AI's real outcomes
Professor Yow Wei Quin, a psychologist from the Singapore University of Technology and Design, shared findings from studies she conducted that involved older adults using conversational AI chatbots. While some found the system comforting, others saw it as unsettling or risky, raising concerns about privacy, trust, and how it seemed "too human". "No matter how intelligent the system is, the outcome is really shaped by human factors," she said.
She called for a shift in what "successful AI" should mean: away from maximising adoption, and towards building tools that help users calibrate reliance through "appropriate distrust", so people learn when and when not to rely on AI.
Offering his view from the social sector, Mr Martin Tan, CEO of The Majurity Trust, warned that the private sector's rapid pace in developing AI tools could turn the human work of social services into a commodity that can only be accessed through technology.
"The worry is that the private sector is moving so fast that the social work sector is not catching up," he said. If this trend continues, the social services sector could see its domain expertise in counselling and mental health ceded to private-sector products that are created and packaged far faster than social service workers can deliver their services.
For speakers, that was a concrete reminder of why the humanities and social sciences remain essential. Their role is not only to understand behaviour and trust, but to safeguard how care, equity and accountability are organised as AI enters everyday life.
Strengthening the impact of the humanities and social sciences in society amid the AI deluge
The event closed with a consensus that the humanities and social sciences are already central to how societies navigate AI, and the challenge now is to make that contribution more visible and felt beyond campus.
Professor Kenneth Benoit, Dean of the School of Social Sciences at Singapore Management University and Chair of the 2026 Ideas Festival, urged humanities and social sciences academics to look "not just to scholarly impact, but also to societal impact," and to ask whether their work is reaching "policymakers, industry, communities, the broader public."
Prof Wee was more specific. He expressed hope that humanities and social sciences academics across Singapore's universities will collaborate on an implementable AI-literacy curriculum that begins earlier in students' learning journeys, calling on all "to put in a curriculum about navigating…and engaging with AI, and the various challenges…but made understandable to much younger individuals."
He added that it is a step that would resonate with universities in other parts of the world as they grapple with the same questions about how to prepare the next generation for life with AI.