How can we embed ethics, trust and transparency in social robots?

Today, the All-Party Parliamentary Group on Data Analytics (APGDA) held a roundtable in partnership with RoboTips. The event discussed the implementation of the ethical black box (EBB) and the need for this technology within social robots.
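For readers unfamiliar with the idea, the EBB proposed by Professors Winfield and Jirotka works much like an aircraft flight data recorder: it continuously records what a robot senses, decides and does, so that accidents and near-misses can be investigated afterwards. The sketch below is a minimal, hypothetical illustration of that idea in Python; the class and field names are assumptions made for this post, not the RoboTips design or any real specification.

```python
# Illustrative sketch only: a minimal "ethical black box"-style recorder.
# All names and fields here are assumptions for illustration.
import json
import time
from dataclasses import dataclass, asdict
from typing import Any


@dataclass
class EBBRecord:
    """One timestamped snapshot of what the robot sensed, decided, and did."""
    timestamp: float
    sensor_data: dict[str, Any]      # e.g. proximity readings, person detection
    internal_state: dict[str, Any]   # e.g. current goal, battery level
    decision: str                    # the action the robot chose
    rationale: str                   # human-readable reason for the decision


class EthicalBlackBox:
    """Append-only log so investigators can later reconstruct what the robot
    knew and why it acted as it did. A real EBB would need secure,
    tamper-resistant storage; a plain file is used here for simplicity."""

    def __init__(self, path: str = "ebb_log.jsonl"):
        self.path = path

    def record(self, rec: EBBRecord) -> None:
        # Append each record as one JSON line.
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(asdict(rec)) + "\n")


# Example: log a single decision made by a hypothetical care robot.
ebb = EthicalBlackBox()
ebb.record(EBBRecord(
    timestamp=time.time(),
    sensor_data={"proximity_cm": 42, "person_detected": True},
    internal_state={"goal": "deliver_medication", "battery_pct": 71},
    decision="pause_and_announce",
    rationale="Person detected within safety distance; stopping before proceeding.",
))
```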

The development of social robots and artificial intelligence continues to advance rapidly. However, these developments have often been accompanied by concerns about the systems' potential to cause harm to humans.

While the potential impacts of AI are recognised as far-reaching, realising them depends on society's trust. If the public is not assured that human-technology interactions are ethical and will bring about positive change, such interactions risk being widely rejected and their benefits wasted.

This roundtable was chaired by APGDA Co-Chair Lord Holmes of Richmond.

Speakers included:

  1. Lord Tim Clement-Jones, Co-Chair of the All-Party Parliamentary Group on Artificial Intelligence
  2. Professor Marina Jirotka, Professor of Human Centred Computing at the University of Oxford, RoboTips
  3. Professor Alan Winfield, Professor of Robot Ethics at the University of the West of England, Bristol, RoboTips

After presentations from the panellists, the discussion turned to broader themes: transparency, the policy road map, and the first steps government should take.

Developing citizens' trust is also a key focus of the APGDA's work, as highlighted in our recent reports Trust, Transparency and Tech and Our Place, Our Data.

This roundtable was live-tweeted from @DataAPG. Follow us on Twitter so you don't miss upcoming events.