Entitled “Data: a new hope”, Data Summit 2022 had a decidedly ambitious tone – informed in large part by the formidable presence of space-related speeches and signs that encouraged global thinking.
That is not to say the event overlooked those most affected by data. A major theme woven through the event was data bias, and diversity and inclusion in the sector.
Data Lab CEO Brian Hills noted two main areas of “hope” for Scotland’s tech sector.
More transformational projects must be the goal, Hills noted, alongside a focus on inclusion and diversity.
How can data help the most vulnerable in society?
Linking values to data was the key takeaway from a panel focused on using data to help vulnerable people.
With a mix of data scientists from the public and social sectors, the panel stressed the importance of an open-minded approach.
Starting conversations with values first to drive the appropriate use of data to inform decision-making was a general theme.
Giselle Cory of DataKind, a charity that helps other charities use data to make informed decisions, noted how data can help manage decisions even in the most tumultuous socio-political climates.
Reflecting on the pandemic, Cory found that charities that weren’t using data were “struggling” as the world changed dramatically for them and their target demographics.
Responsible data sharing was also a theme of the conference, and of the day in general.
Forming a culture of trust was a hope expressed by Alex Fassio of the Ministry of Justice, who uses data to inform policy and track change.
The panel explained how the private sector’s powerful data resources – held by organizations very willing to help the third sector – go underused because no good framework exists for private sector companies to “share ideas without risking sharing data or breaching data protection,” Fassio said.
How essential is data to the vision of the Scottish Government?
Tom Arthur MSP, Minister for Public Finance, Planning and Community Wealth, the special guest and Scottish Government representative for the day, said the government is doing more to gain people’s trust to use data, and must do so in an ethical manner.
Unable to attend in person this year, First Minister Nicola Sturgeon weighed in again on the subject of where data fits in Scotland.
She said: “Data is at the heart of the Scottish Government’s vision for our economic future. It is an area where Scotland already excels, where we have the ambition to do better and where there is potential for significant economic opportunity and significant social benefits.
“Data is absolutely essential to addressing social and economic challenges, creating jobs and wealth, and improving people’s health and well-being. The talent, innovation and energy on display at an event like the Data Summit demonstrates that the data sector here in Scotland is healthy.
Does the data have an aura of undeserved objectivity?
In the final keynote of the day, Hannah Fry, professor of mathematics, delved into data bias, a major underlying theme throughout the day.
Fry challenged the audience, using different case studies to question the formation of data and algorithms when both are often built from a foundation of bias in their creation.
If you Google “math teacher,” for example, you’ll find that only two out of 20 photos in the image results are of women.
94% of maths teachers in the UK are male, so it follows that this is a pretty accurate representation of that particular professional landscape, right?
Not really, according to Fry. As she says, “Sometimes we don’t want technology to be a mirror of the world we live in, it can be more ambitious.”
Most bias, Fry said, results from “thoughtless omission”, but it sends a message of “You don’t belong here”, especially to those who weren’t involved in making the technology or databases.
Fry provided an array of examples, from Nikon’s camera software flagging the eyes of a smiling East Asian person as a “blink”, to automatic soap dispensers failing to recognize darker skin, to services offering no adequate gender selection options.
Importantly, Fry noted, “bias is not just a technical issue.”
Nothing expressed this more succinctly than the data she shared about an algorithm used in the US healthcare system to select high-risk patients for care.
The algorithm’s creators removed race as a variable in an attempt to achieve fairness; the proxy they used to identify the “sickest” patients was the amount of money already spent on their treatment.
By removing race, however, they also removed any consideration of racism as an institution. When the number of chronic diseases was used to measure sickness instead, it emerged that Black patients had to be even sicker than white patients to access the same amount of healthcare under the spending proxy.
Bias in the data cannot be corrected with statistical tricks – the bias exists outside the data and will be exacerbated by the data if not handled properly.
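The proxy problem Fry described can be sketched with a toy example – all numbers below are invented for illustration, not taken from the study she cited. Ranking patients by past spending under-selects a group whose historical access to care, and therefore spending, was lower, even though race never appears as a feature:

```python
# Toy illustration (hypothetical numbers): ranking patients by past
# spending as a proxy for need can encode historical disparities,
# even when the sensitive attribute is never used as a feature.

patients = [
    # (group, chronic_conditions, past_spending)
    ("A", 5, 9000),   # group A: historically higher access to care
    ("A", 3, 6000),
    ("A", 2, 4000),
    ("B", 5, 6000),   # group B: equally sick, but lower past spending
    ("B", 4, 4500),
    ("B", 2, 2500),
]

def select_high_risk(patients, k=3, proxy_index=2):
    """Pick the k 'highest-risk' patients by the chosen proxy column."""
    return sorted(patients, key=lambda p: p[proxy_index], reverse=True)[:k]

by_spending = select_high_risk(patients, proxy_index=2)    # spending proxy
by_conditions = select_high_risk(patients, proxy_index=1)  # sickness proxy

print([p[0] for p in by_spending])    # → ['A', 'A', 'B']
print([p[0] for p in by_conditions])  # → ['A', 'B', 'B']
```

Both proxies pick three “high-risk” patients, but the spending proxy skews selection toward the group with historically higher spending: a group-B patient must be sicker than a group-A patient to be chosen, mirroring the effect Fry described.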
Linking back to earlier talking points, Fry lamented that even in education, biases drive certain demographics away from STEM careers.
When math gets tough, as it inevitably does, Fry said, women are less likely to stick with it because they have not been nurtured on the basis of their intelligence, whereas men of the same ability were already confident thanks to conversations from an early age that focused on their ability.
Bias in data is “the closest shadow to the deeper issues” the world needs to address, Fry said.
The answer, she argued, lies in increasing the diversity of the people who create databases, algorithms and technologies, and in accounting for biases rather than ignoring them in futile attempts to eliminate them.