Our Bettakultcha event series returned to Leeds last week for its third instalment, and this time we were talking all things data. From the public perception of data and the way it’s used, to the impact it is having on shaping our future world, five speakers took on the short and snappy presentation format – 20 slides, 15 seconds per slide – to share their views on the topic.
First up was Eleanor Snare, consultant and coach. As a performer, poet and artist, Eleanor discussed a less traditional perception of data, highlighting one particular data source that we all use, every minute of every day – our feelings and emotions. Rather than relying on statistics and digital data, Eleanor suggested that our gut instinct should guide us, and that we should have the confidence to trust our own feelings to make well-informed decisions in life. Eleanor summed up by saying that all kinds of data have validity: just because data doesn’t have a number attached to it doesn’t mean it isn’t a valid source.
Graham Spearing, Product Lead for Data Services Alliance at NHS Digital, followed with an insight into how the NHS uses our data and how this is perceived by the public. Despite a general lack of trust when it comes to institutional data, it seems the NHS has a more positive reputation when it comes to the use of public data. Attributing this perception to the nature of the sector, Graham showcased the power that this personal data has. Not only can it help plan out clinical pathways, it is also used for research to improve care.
Highlighting the often-controversial perception of data and machine learning, Trevor Hardcastle, Chief Data Scientist at Vet AI, challenged us to ignore the media’s scaremongering around AI’s takeover of the human race. He suggested we all take a more optimistic view of how data could shape our future – from predicting illnesses to diagnosing symptoms. After all, AI technology is more likely to save your life than pose a threat to it. When data is used in the right way, for the right reasons, the future is bright and we should embrace these opportunities. It’s the dangerous potential for humans to use it for unethical purposes that we should be concerned about.
Next up was Graeme Tiffany, community philosopher, who focused on the people and organisations harvesting our personal data without permission, for seemingly good reasons. For example, facial recognition is currently being used in China to prevent crime. But how far will these uses extend? And what happens when the data is wrong? Graeme highlighted an alarming report showing that a fifth of this data returns false positives. If we can’t trust the data itself to be accurate, how can we trust the way it’s being used?
Finally, Tom Forth, Head of Data at ODI Leeds, suggested that rather than trusting or distrusting data, we should think less and experiment more. His work sees him continually collecting and sharing data to help us better understand society. So when we’re asked ‘can we trust how data is used?’, let’s look at what experiments have been done to answer that question, rather than thinking and worrying about it. Let the experiments lead our thinking, rather than the other way round. Tom used the example of Amazon using AI technology for recruitment. This ‘experiment’ highlighted a gender bias towards male candidates in their recruitment processes. The media portrayed this as a huge fail for Amazon, shining a spotlight on archaic recruitment preferences, but what the ‘experiment’ actually delivered was a set of data that enabled Amazon to make positive changes to improve equality in their hiring processes.
In conclusion, data has immense power to shape the future, from protecting our health and preventing crime, to making society more efficient and well informed. However, just as there is great opportunity for good, there is also the dangerous potential that, in the wrong hands, data could be used against us. We must experiment and verify our data, sharing it in an open and transparent way, if we are to improve public trust around its use.