Projects

MPC Network members have a range of interdisciplinary research projects underway, many involving a great deal of public engagement.

Ongoing is our work on contemporary fake news, disinformation and their political impacts. Fake news is a complex phenomenon, driven by technology, economics, propaganda and weak digital literacy, and there are many sociological and psychological reasons why people spread false information. As such, there is no single solution to fake news, and MPC members are approaching the problem from multiple angles. We have written and spoken extensively about these issues for academic journals and subject associations, for the UK Parliament's Fake News Inquiry, and to international journalists and fact-checkers.

Also ongoing is our work on emotional Artificial Intelligence ('emotional AI'). Emotional AI encompasses machines that use AI techniques to sense and 'feel-into' human emotional life. Inputs might be facial expressions, voice samples and biofeedback data; the outputs are typically classified emotional states, which in turn are used to make judgments and predictions about human behaviour. Just think of how this might apply to media content, digital agents, devices and things we encounter in our environments. Is making human emotional life machine-readable acceptable? If so, on what terms? If not, precisely why? This set of related projects is generating books, papers, industry-relevant reports, policy analysis and practical solutions for addressing emergent technological interest in emotional life.
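For readers who want a concrete picture of the pipeline just described, below is a minimal, purely illustrative Python sketch. The feature names, emotion labels and numbers are invented for illustration only and do not represent any MPC project, dataset or commercial system; the sketch simply shows the flow from input signals to a classified emotional state to a downstream judgment.

```python
# Purely illustrative, hypothetical sketch of an emotional AI pipeline:
# numeric features stand in for signals such as facial expression, voice and
# biofeedback data, and a toy nearest-centroid rule stands in for a trained model.
from math import dist

# Hypothetical per-emotion reference vectors (all feature names and values invented):
# [smile_intensity, voice_pitch_variance, skin_conductance]
CENTROIDS = {
    "happy":    [0.8, 0.6, 0.3],
    "stressed": [0.2, 0.7, 0.9],
    "calm":     [0.4, 0.2, 0.2],
}

def classify_emotion(features: list[float]) -> str:
    """Return the emotion label whose reference vector is closest to the input."""
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

def predict_behaviour(emotion: str) -> str:
    """Toy downstream judgment of the kind such classifications feed into."""
    return {"happy": "likely to engage",
            "stressed": "likely to disengage",
            "calm": "neutral"}[emotion]

if __name__ == "__main__":
    sample = [0.25, 0.65, 0.85]        # e.g. derived from face/voice/biofeedback
    emotion = classify_emotion(sample)  # -> "stressed" for this invented input
    print(emotion, "->", predict_behaviour(emotion))
```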

We have various projects on data transparency and governance. We examine digital political campaigning and the problems caused by its lack of transparency, submitting reports to bodies such as the Parliament of Victoria (Australia) Electoral Matters Committee and the UK House of Lords Select Committee on Democracy and Digital Technologies. We have run multi-disciplinary projects examining data transparency in relation to privacy, security, surveillance and sousveillance, and trust, as well as artistic projects on data transparency and mutual watching. We are also working closely with the personal data storage app developer Cufflink (a world-facing company based at Bangor University's M-SParc Science Park) to ensure that ethics are built into the app's design.

Our work also examines persuasive representations and risk communication. For instance, drawing on our Risk Communication sub-group and funded PhD student work, we have made a short documentary for policy-makers and funders explaining the new technology of carbon capture.