The European Commission has published a Policy Report (Report) on AI and digital transformation lessons from Covid-19. The Report notes that “the Covid-19 pandemic has caused something akin to a natural experiment. It has exposed us to unforeseen and unprecedented conditions, forcing us to react in ways unimaginable just six months ago.”
The Report’s assessment of the results of this experiment is that the pandemic has exposed AI as a double-edged sword: both a saviour and a hindrance.
The pandemic has boosted AI adoption and data sharing to the benefit of public health:
- Healthcare: Prior to the pandemic, AI technology was primarily used in operational, financial and administrative functions, such as AI software to facilitate patient management. Innovative uses of AI spurred by Covid-19 include the detection of contaminated surfaces to reduce the risk of contagion. Robots are also increasingly used to disinfect areas with toxic chemicals, monitor social distancing, take temperatures, manage patient triage and deliver medication in virus-contaminated environments.
A criticism of AI technology in healthcare is that the datasets used to train AI algorithms are limited and fragmented, with potential racial and gender biases. France has created a Health Data Hub to overcome fragmented datasets: it collates anonymised data from medical prescriptions, financial data from hospitals, disability information and samples of private insurance reimbursement data. Finland has mandated the sharing of health data with a centralised government data pool.
- Research and vaccine discovery: The European Commission has also set up a Covid-19 Data Portal enabling researchers to access, upload and analyse Covid-19 related data. The Open Covid Pledge encourages voluntary sharing of intellectual property to accelerate vaccine discovery by making patents and copyrights temporarily available free of charge.
- Synthetic population modelling: AI technology is also useful in creating synthetic populations to help manage the re-opening of economies. The Report models how publicly available data can be used to build a model covering travel-to-work behaviour, the estimated number of daily contacts based on household size, social behaviour in different industry sectors (e.g. likely close contacts on a factory shop floor vs in an office environment) and the socio-economic characteristics of different regions. The tool can also be run in reverse to estimate the costs of re-closing to deal with localised ‘hot spots’.
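The Report does not publish its model or parameters, but the idea of a synthetic population built from public data can be sketched in a few lines. The sector names, contact rates and distribution weights below are illustrative assumptions, not figures from the Report:

```python
import random

# Hypothetical close-contacts-per-day by sector (assumed values only,
# echoing the Report's factory-floor vs office contrast).
SECTOR_CONTACTS = {"factory": 12, "office": 6, "remote": 1}

def make_synthetic_person(rng):
    """Draw one synthetic individual from assumed marginal distributions."""
    sector = rng.choices(list(SECTOR_CONTACTS), weights=[0.3, 0.4, 0.3])[0]
    household_size = rng.randint(1, 5)
    commutes = sector != "remote" and rng.random() < 0.8
    return {"sector": sector, "household_size": household_size,
            "commutes": commutes}

def expected_daily_contacts(person):
    """Workplace contacts plus household contacts (members other than self)."""
    return SECTOR_CONTACTS[person["sector"]] + (person["household_size"] - 1)

rng = random.Random(42)
population = [make_synthetic_person(rng) for _ in range(10_000)]
avg = sum(expected_daily_contacts(p) for p in population) / len(population)
print(f"average daily contacts: {avg:.1f}")
```

Varying the sector weights or contact rates then lets the model compare re-opening scenarios; running it ‘in reverse’, as the Report describes, amounts to estimating the contacts (and hence economic activity) removed by re-closing a sector.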
One increasingly apparent tension is reconciling the beneficial uses of AI with protecting privacy and avoiding unnecessary mass surveillance. ‘Surveillance capitalism’ risks becoming an accepted norm during a public health emergency. For example, the Italian city of Varese is experimenting with placing electronic bracelets on children that vibrate if the children breach social distancing.
Individual rights can be derogated from in an emergency, as permitted, for example, by the European Convention on Human Rights. Modelling can be used to justify restrictions on individual freedoms where private and social costs diverge: restricting private behaviour avoids much larger social costs, and the loss of private welfare can be minimised provided there is adequate compensation. The Report quotes a study that estimated the private cost of a Covid-19 infection at $80k, while the social cost is around $286k. The Report concludes that “this wide gap provides a stark illustration of the social desirability to reduce freedom in private behavioural choices, through social distancing and confinement measures, to avoid high social costs”.
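The arithmetic behind the ‘wide gap’ is worth making explicit. Using only the two figures quoted in the Report, the uncompensated externality per infection can be computed directly:

```python
# Figures quoted in the Report (US dollars per Covid-19 infection).
private_cost = 80_000    # cost borne by the infected individual
social_cost = 286_000    # total cost, including costs imposed on others

externality = social_cost - private_cost    # cost not internalised
share_external = externality / social_cost  # fraction borne by others

print(f"externality per infection: ${externality:,}")  # $206,000
print(f"share borne by others: {share_external:.0%}")  # 72%
```

On these figures, roughly 72% of the cost of each infection falls on people other than the infected individual, which is the economic basis for restricting private behavioural choices.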
The problem is that, when the emergency passes, the derogation from individual rights is often not unwound. The Report identifies a rare example of unwinding: in July 2020, Norway deleted all data collected by its contact tracing app once community transmission had dropped to low levels.
The Report also says that Covid-19 showed the power of Big Tech, though the outcomes were not necessarily bad for consumers. EU governments initially pursued contact tracing technologies that stored personal data on central servers. Google and Apple instead allied behind a decentralised solution that stores all the data on users’ own phones, increasing privacy. Most EU governments were forced to change their plans and follow this approach. The Report notes that this flexing of Big Tech’s muscles to enhance privacy was “ironic given past events, such as the Cambridge Analytica scandal”.
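The privacy difference between the two designs comes down to where matching happens. The sketch below illustrates the decentralised idea in simplified form: phones broadcast short-lived random identifiers and each phone checks for exposure locally, so no central server ever sees the contact graph. This is an illustrative assumption-laden sketch, not the actual Google/Apple Exposure Notification protocol, which derives identifiers cryptographically from rotating keys:

```python
import secrets

class Phone:
    """Toy model of a handset in a decentralised contact tracing scheme."""

    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers heard from nearby phones

    def broadcast(self):
        # Fresh short-lived random identifier; reveals nothing about the user.
        rpi = secrets.token_hex(8)
        self.own_ids.append(rpi)
        return rpi

    def exposure_check(self, published_infected_ids):
        # Runs on-device: did I hear any identifier later published
        # by a user who tested positive?
        return bool(self.heard_ids & set(published_infected_ids))

alice, bob = Phone(), Phone()
bob.heard_ids.add(alice.broadcast())   # Alice and Bob were in proximity

# Alice tests positive and publishes only her *own* identifiers;
# Bob's phone discovers the exposure without any server learning of it.
print(bob.exposure_check(alice.own_ids))
```

In the centralised designs governments initially pursued, the equivalent of `heard_ids` would be uploaded to a server, which could then reconstruct who met whom, which is precisely what the decentralised approach avoids.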
Digital divide becomes a health divide
The Report highlights the difficulties of effectively managing a health crisis with digital tools when there are stark economic and social inequalities: 27% of EU citizens either perceive themselves as lacking digital skills, do not own a smartphone or are unwilling to use contact tracing apps. In a ‘get ready for the next big one’ message, the Report urges governments to take active steps to avoid leaving vulnerable groups behind in the digital health revolution.