Project completion - looking at what we've done
We’ve now reached the end of our project, so it’s time to look back on the work we did and the things we learned along the way. This includes the deliverables produced as part of our project funding requirements.
What we’ve done
At the start of this process we submitted an application for a discovery project that aimed to deliver:
- a methodology for evaluating whether a particular application is suitable for developing into a chatbot or AI product
- a research base to help local authorities develop their individual business cases, saving time and resources rather than duplicating research
- a summary of the available technology solutions, with the advantages and disadvantages of each
- a set of case studies drawn from participating councils
We’ve now reached the end of the project, and all of these deliverables are available below or via our project resources page.
But this project has been about more than just producing a set of reports:
- In Oxford, we’ve used the Digital Outcomes and Specialists Framework for the first time to procure a digital agency
- We’ve trained a batch of staff in the participating councils in the basics of user research and how to present their findings as user experience maps
- We’ve had the councils undertake user research interviews for the first time, and given them a set of tools to become more self-sufficient in this area
- We’ve worked collaboratively to support each other and share the outcomes of our research
- We’ve worked in the open and made all our materials available for anyone else in the sector to use
Project deliverables
- Project Summary Report (Torchbox, April 2019)
- User Research Summary Report (Torchbox, April 2019)
- Case Studies (Torchbox, April 2019)
- ROI Analysis and Market Summary (Torchbox, April 2019)
- Technology Landscape Review (Torchbox, April 2019)
- Example Conversational AI Architecture (Torchbox, April 2019)
- Chatbot Feature Comparison Matrix (Torchbox, April 2019)
What we have learned
The following points are distilled from the main project reports, which provide far greater detail than this summary can:
Collaboration
- Collaboration is difficult to do, but pays dividends if you get it right; it is as much about how you work together as what you are working on together
- Working collaboratively on discovery allows you to pool a wider sample of user research participants and gain insight into any regional variations
- The initial one-year investment in building a chatbot shared by 20 councils would be £779,149. The estimated savings across all 20 councils would be £2.2m (nearly four times the savings if councils developed their own chatbots in isolation)
- Splitting this investment across the 20 councils means that the per-council cost is only £38,957 per year, compared to £111,307 if each council developed its own chatbot (a quick check of these figures is sketched below)
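For anyone who wants to sanity-check those per-council figures, here is a minimal back-of-envelope sketch in Python using only the numbers quoted above; the detailed assumptions behind them sit in the ROI Analysis and Market Summary report.

```python
# Illustrative check of the shared-cost figures quoted above.
shared_build_cost = 779_149   # initial one-year investment in one shared chatbot (£)
councils = 20                 # number of councils sharing the platform
individual_cost = 111_307     # estimated one-year cost per council if built alone (£)

per_council_shared = shared_build_cost / councils

print(f"Per-council cost when shared:    £{per_council_shared:,.0f}")  # ≈ £38,957
print(f"Per-council cost if built alone: £{individual_cost:,}")        # £111,307
print(f"Ratio (alone vs shared):         {individual_cost / per_council_shared:.1f}x")
```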
User research
- Any conversation with a council is nearly always part of a longer journey. As a result, when considering chatbots, it is vital to model requirements in terms of user journeys, rather than simply in terms of technical specifications.
- A service with a high number of complex enquiries (for example, one involving an emotional theme, a complex subject area, a topic prone to subjectivity, or a matter of contention or debate) is not good territory for a chatbot. These complex, human-driven enquiries are better handled by a person
- There are simple things we can do to improve the user journey before turning to chatbots, such as improving web content, better search optimisation and easier routes to getting more help
Recommended option from research
- Of the four services we explored in depth (Planning, Waste & Recycling, Revenues & Benefits, Highways), the most likely candidate for an alpha project to take the research forward was Waste & Recycling. This was because:
- it deals with a high volume of simple information or service requests that have a high rate of first-line resolution
- these requests fall into relatively few distinct reasons for contact (80% of calls can be categorised into three specific reasons), which means a chatbot can address a large proportion of calls
- it involves a largely non-emotional user journey and an expressed user need for self-service
Technology approach
As well as exploring the current chatbot/AI market and which platform approach would be most suitable, the project has explored the potential to develop a single, centralised approach. The two reports on this subject found the following:
- All the major cloud and open source providers have adopted a very similar set of technologies for their conversational AI platforms, meaning they can all be trained from a very similar data model (see the sketch after this list for a flavour of what such a model might look like).
- The primary challenge in developing a shared chatbot platform for local government is not technical but organisational. Councils must agree to co-operate, and then collaborate in building an overall model for the chosen research area that can adapt both to the variety of ways councils handle that area and to the variety of terms used for it across England
- Whilst an entirely open-source version of a centralised platform is attractive, it may not be the cheaper option, since the cost to establish this system at scale for multiple councils would be significant. Utilising an existing cloud-based proprietary platform built for this sort of scenario could be cheaper and provide a higher quality solution.
- The software cost per serve is generally a small portion of the overall cost. The costs of developing the initial model, the investment in ongoing maintenance and training, and the integrations to a wide variety of differing back-end systems across each council are likely to represent the greatest portion of project cost.
- We must carefully select use cases where the information available, and the back-end infrastructure sending data to and receiving data from the conversational AI system, are sufficient to create a satisfactory outcome for the user. For these reasons, use-case selection and user research are extremely important in creating a data model for the system to use.
- Although differences in features between the best platforms are small, the overall recommendation from this project for a conversational AI platform would be:
- IBM Watson Assistant Plus or Premium for a system hosted in the public cloud; or
- Rasa Stack for an open-source system hosted in a private cloud
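To make the idea of a shared, trainable data model a little more concrete, the sketch below shows one hypothetical way of structuring it in Python. It is not the native format of Watson Assistant, Rasa or any other platform, and the council names and phrasings are made up; it simply illustrates how a core set of intents could be built once and then adapted to the local terminology each council’s residents actually use.

```python
from dataclasses import dataclass, field

@dataclass
class Intent:
    """One user goal the chatbot should recognise, with example phrasings."""
    name: str
    examples: list[str] = field(default_factory=list)

# A shared 'core' model for Waste & Recycling, built collaboratively once.
core_intents = [
    Intent("report_missed_collection", [
        "my bin wasn't emptied today",
        "you missed my recycling collection",
    ]),
    Intent("check_collection_day", [
        "when is my next bin day",
        "what day do you collect garden waste",
    ]),
]

# Hypothetical local variations in terminology, layered on top of the shared
# model so each council's chatbot understands the words its residents use.
local_synonyms = {
    "council_a": {"bin": ["wheelie bin", "black box"]},
    "council_b": {"bin": ["refuse bin"]},
}

def training_examples(council: str) -> list[tuple[str, str]]:
    """Expand the shared examples with a council's local synonyms."""
    pairs = []
    for intent in core_intents:
        for example in intent.examples:
            pairs.append((intent.name, example))
            for term, synonyms in local_synonyms.get(council, {}).items():
                if term in example:
                    pairs.extend(
                        (intent.name, example.replace(term, synonym))
                        for synonym in synonyms
                    )
    return pairs

print(training_examples("council_a"))
```

In practice each platform has its own training-data format, but because they all work from intents, example utterances and entities in roughly this shape, a model designed along these lines could be exported to whichever platform a shared project ultimately chose.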
Next steps
Sharing our learning
We’re hoping to organise an event to share what we learned on the project with as many councils as possible, in the spirit of ‘giving something back’ to the sector. We’ll tweet about this and post via the LocalGov Digital Slack team when more details are available.
Alpha project
This project has been one of the 16 exemplar projects funded in the first round of the Local Digital Fund, 10 of which were discovery projects. At this stage, no announcement has been made about funding for 2019/20, or about what the balance might be between discovery and alpha projects.
Should it be possible to bid for an alpha project, we would need to identify councils interested in collaborating on an application and in working on the project.
It would be great to hear from councils that would be interested in this sort of project. Why not get in touch and let us know?