2 April 2026
By Andrew Morley, Senior Practice Development Consultant, SCIE
On 23 March, I had the pleasure of joining Care Talk Magazine’s roundtable, ‘From Analogue to Digital: Unlocking Technology’s Role in the Future of Care’, chaired by Professor Martin Green OBE, Chief Executive of Care England and Chair of the TEC Services Association. The conversation brought together providers, people with lived experience, technologists and sector leaders, and included representatives from Bupa, Think Local Act Personal (TLAP), Community Integrated Care, the Carers Collective, Autumna, The Care Workers Charity, Person Centred Software and others. It reinforced something that has become increasingly clear through SCIE’s work on Artificial Intelligence (AI) over the last 12 months: technology in care is fundamentally different from technology in any other sector.
If we treat technology as a narrow ‘solution to a problem’, we miss the whole purpose of care and support, which is helping people live the lives they want, in the way they choose. Technology can absolutely help us do that, but only if we approach it with the right values, the right safeguards, and the right people shaping it.
Across the roundtable, there was strong agreement that the value of technology lies in enabling independence, choice and positive risk-taking. As we discussed, there is huge potential for better use of data to help people engage with the risks they want to take, rather than feeling limited by the offers available to them.
This echoes one of the strongest messages from SCIE’s Digital Programme on behalf of the Department of Health and Social Care (DHSC) and Partners in Care and Health. The programme, delivered over the past 12 months, explores the practice, governance and ethical implications of AI use in adult social care.
Throughout the programme, participants repeatedly emphasised that AI must support, not replace, professional judgement — a theme captured clearly in SCIE’s upcoming Digital Programme report ‘AI Decision Making in Adult Social Care’, which will be published in April: “AI may alert us to patterns or risks, but it cannot determine what those signals mean without human oversight.”
The roundtable also highlighted how data is still too often siloed across agencies in ways that don’t reflect how people experience their own lives, as an indivisible whole. Bringing information together offers the possibility of understanding people as unique individuals and supporting more effective integration and partnership working. But doing so requires confronting some significant ethical and technical hurdles.
One of the most powerful themes from both the roundtable and our recent programme is the need for early, meaningful co-production. People with lived experience told us repeatedly that involvement often happens too late, or becomes symbolic when they feel they need technical expertise to contribute – issues also highlighted in SCIE’s 2025 report ‘Embracing change: scaling innovation in social care in practice’, which shares invaluable lessons from our work supporting the DHSC’s Accelerating Reform Fund.
As one participant in SCIE’s AI Co-production working group put it: “Co-production should shape purpose, assumptions, and values… not just comment on finished products.” As part of SCIE’s Digital Programme, this group helped produce an AI Co-production Charter and Toolkit — a practical resource that local authorities can use to ensure that people’s voices shape AI design, deployment and monitoring from the outset. This emphasis on values-led design was also reflected in our recent work with Bromley Council, where SCIE provided an independent review of their AI transcription pilot before it was scaled up. Bromley recognised that successful digital transformation required more than technology alone; it needed a strong focus on culture change, and they delivered this alongside a linked project to develop their co-production strategy. This mirrors the wider lesson that ethical, inclusive innovation depends on organisational readiness as much as technical capability.
Equity and bias were also major concerns. SCIE’s ‘AI Decision Making in Adult Social Care’ report highlights that bias is embedded not just in algorithms, but in data, design choices and organisational systems. A participant in SCIE’s Digital Programme described examples of AI misinterpreting everyday actions — such as “picking food from fridge” being recorded as “able to prepare meals” — graphically illustrating how context and cultural nuance can be lost. This was echoed in the clear view that emerged from the Care Talk roundtable: data is rarely neutral. It is shaped by the questions asked, how it is collected and recorded, who can access it, and how it is interpreted by the person using it.
These risks underscore why inclusive design, diverse testing and culturally competent procurement are essential.
At the roundtable, we spoke about the need to rebalance what we make visible. Care is full of moments that go right every day, yet many systems are designed to surface what went wrong. Technology should help us recognise and celebrate quality, not just evidence risk.
But trust is the foundation for all of this. SCIE’s Digital Programme revealed that meaningful consent is one of the most challenging areas. Consent is too often treated as a one-off administrative step, yet, as our report notes, “meaningful consent must be informed, ongoing, capacity sensitive, and free from coercion.”
Another strong theme from the roundtable was the need for a more confident, mature relationship with innovation — one that embraces opportunity but is honest about risk. This aligns closely with the findings of SCIE’s Digital Programme. As one participant put it: “We need governance that learns and evolves… and fits within our broader systems of monitoring, reflection and improvement.”
As part of our forthcoming ‘AI Decision Making in Adult Social Care’ report, we set out practical tools that local authorities can adopt.
These insights also fed into our development of a six-domain evaluation framework (part of the AI Decision Making in Adult Social Care report) to help local authorities assess proposals, supplier pitches and internal ideas in a structured, values-led way.
What struck me at the roundtable was how aligned the sector is becoming. Whether discussing predictive analytics, Technology Enabled Care, workforce tools or data sharing, the same principles kept resurfacing.
These themes are central to SCIE’s work this year and will underpin our new resources, to be published in April on the Knowledge Hub, the DHSC space for AI and digital guidance for local authorities. Together, they give local authorities a practical, values-driven foundation for exploring and adopting AI safely, ethically and with confidence.
The roundtable reminded me that while technology can help us move faster, what matters most is that we move in the right direction: one that strengthens autonomy, dignity, inclusion and the relationships at the heart of care.
To receive information about these resources as soon as they are available, please sign up to SCIELine. And if you would like an exploratory chat about how we can help support you with digital transformation to achieve better care while saving costs, contact us.