LISTEN: co-designing and evaluating personalised self-management support for long Covid
LISTEN is jointly led by Bridges Founder, Professor Fiona Jones of Kingston University and St George's, University of London, and Professor Monica Busse of Cardiff University. The LISTEN project aims to work in partnership with people living with long Covid to co-design, co-deliver, and evaluate a personalised self-management support intervention, with a specific focus on non-hospitalised people living with long Covid. Bridges will play a key role in the co-design stages as well as delivering training to all the participating rehabilitation teams. As ever, we will be co-delivering the sessions with people living with long Covid. For the first time we are working in partnership with our colleagues at Diversity and Ability to help ensure the co-design and evaluation are inclusive of people from across all communities.
As the project reaches key milestones, findings and developments will be shared widely and feedback invited from the broader long Covid community. In the meantime, all the latest project news will be posted on the LISTEN Twitter page @TheLISTENproj. For more about the LISTEN project, please see our blog: https://www.bridgesselfmanagement.org.uk/high-time-to-listen/
- Adults
- Long term conditions
- Complex
- Other [please specify]
- Home-based
You and me: Artificial Intelligence (AI) to improve communication
Funded by The Health Foundation.
Run by Heart n Soul, in partnership with University of the Arts London’s Creative Computing Institute and the Royal Borough of Greenwich.
Aimed to demonstrate how artificial intelligence (AI) could improve communication between people with learning disabilities and autistic people and health and social care professionals – an area identified as a priority for change by people with learning disabilities and autistic people themselves.
Involved the development and testing of an AI app that uses multimodal machine learning to support understanding.
Ran from November 2023 to March 2024.
Outcomes
By integrating audio, video, images, speech recognition and generative AI, the team created a multimodal analysis and communication tool. They discovered that it is possible for people with learning disabilities and autistic people to develop their own AI systems.
The exploratory app is able to simplify complex letters, change terminology so that the majority of people can understand it, describe pictures, build up an understanding of people's preferences, and be used by people with complex needs, including physical needs, via a simple push-button system.
The project also showed how communication can be an effective tool for improving relationships and enabling culture change: by making simple, accessible communication the norm and complicated, jargon-layered communication the exception, connection and relationships improve for everyone.
- Adults
- Children
- Minority groups
- Workforce
- Cognitive disability
- Assistive equipment
- Technology
- Charity