Preparing for chatbots by engaging key internal stakeholders

How to coordinate the development and delivery of a chatbot in the tertiary education sector.

Overview of key stakeholder groups

Even when using “off the shelf” chatbot solutions, developing, implementing, and maintaining a chatbot involves the following strands of work:

  • Content - The knowledge base that the chatbot draws upon (which in many cases is question-and-answer sets) will need to be developed and/or curated. It may also need to be updated periodically.
  • Technical integration - The chatbot will need to be accessible via at least one platform (e.g. an institution’s website, or a student portal); and the chatbot may also need to interact with sources of data (e.g. an information management system).
  • Promotion - Intended users will need to know that the chatbot has been implemented and how it can help them.
  • User engagement - Engaging users can add value across each of the above strands of work.

Within each of these strands a number of key stakeholder groups will need to be engaged.

Stakeholders to engage for content development:

  • A core team responsible for creating/curating content and updating it periodically
  • Domain experts who should ensure the chatbot’s knowledge base (e.g. its question-and-answer set) is accurate

Stakeholders to engage for technical development:

  • Team(s)/individual(s) responsible for ensuring the chatbot can be accessed by users, and that it is integrated with the necessary back-end systems (e.g. information management systems)

Stakeholders to engage for chatbot promotion:

  • Team(s)/individual(s) responsible for promoting the chatbot to its intended users

Users/beneficiaries to engage with:

  • The intended users of the chatbot
  • The groups who may not use the chatbot but who are intended to benefit from the implementation of it (for example an admin team whose workload could be reduced by the chatbot)

Engaging content development teams

In the context of chatbots, the term ‘knowledge base’ is used to refer to the information the chatbot can draw upon in order to answer student questions.

With some types of chatbots, content developers will need to build this knowledge base by creating and uploading a structured question-and-answer set.
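To make this concrete, a structured question-and-answer set is typically a list of entries that pair one canonical answer with several phrasings of the same question. The sketch below shows one possible shape in Python; the field names (“intent”, “variants”, “answer”) are illustrative and not tied to any particular chatbot platform.

```python
# A minimal sketch of a structured question-and-answer set.
# Field names are illustrative, not tied to a specific platform.
qa_set = [
    {
        "intent": "graduation_date",
        "variants": [
            "When is graduation?",
            "What date is the graduation ceremony?",
        ],
        "answer": "Graduation ceremonies are listed on the events page.",
    },
    {
        "intent": "library_hours",
        "variants": [
            "When is the library open?",
            "What are the library opening times?",
        ],
        "answer": "The library is open 08:00-21:00 on weekdays.",
    },
]

# Each entry pairs one answer with several phrasings of the same
# question, so the chatbot can match varied user wording.
for entry in qa_set:
    print(entry["intent"], "-", len(entry["variants"]), "variants")
```

Anticipating several phrasings per question is what allows the chatbot to match the varied ways real users write.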

In other cases, the chatbot design will allow it to build its knowledge base by drawing information from key sources (for example a college’s website). Where this is the case, the role of content developers is focused on selecting appropriate sources of information, and editing the resultant knowledge base.

In either case, there will need to be a core team in place who will create and/or curate the knowledge base. Input may also be needed from domain experts to ensure the knowledge base is accurate and representative of the information users may want to be given.

Content development - core team

Creating/curating the chatbot’s knowledge base is, in many cases, one of the most significant tasks involved in the chatbot project. A key decision is therefore how this core content development team should be composed. Capacity, priorities, and institutional structure will be central factors in this decision, so there is no single prescription for the optimal team make-up, but the following suggestions are intended to help with the process of selection. An initial consideration is whether the core team should be a subset of an existing team, or whether it should include a cross-section of individuals from other relevant teams.

Subset of existing team

Advantages: Housing the core development team inside an existing team can make the planning and management of work simpler.

Disadvantages: In many institutions, and for many chatbot use cases, there may be no single team that has all the knowledge needed to create/curate accurate and useful content.

Considerations: If the core content development team is housed within an existing team, it may need to draw upon other domain experts so that the content it creates/curates is accurate and useful to users.

Cross-sectional team

Advantages: Cross-sectional teams bring wide-ranging knowledge that may be useful to the chatbot’s development.

Disadvantages: The process of planning and managing work can be more complex.

Considerations: Consider how large a cross-sectional team would need to be in order to include enough domain expertise. The larger the team, the less likely this model is to be successful.

If a cross-sectional team structure is decided upon, the next task is to select the appropriate individuals to join the team. Think about the topics the chatbot needs to have knowledge about (e.g. pastoral care, enrolment, careers, finance) and ensure the team includes individuals who have the relevant knowledge. If it is decided that the core content development team will be a subset of an existing team, the next question is which team is most appropriate. In addition to the factors of capacity, priorities and institutional structure, a further consideration here is that of skill sets.

The team needs to:

  • Understand the objectives of the chatbot and the intended benefits for users and other stakeholders
  • Have the ability to become proficient at using the chatbot platform
  • Be able to work with colleagues throughout an institution in order to source information that may be used to build the chatbot’s knowledge base
  • Be able to analyse data on the chatbot’s performance in order to make decisions on how/whether to update the chatbot

Numerous existing teams within an institution may meet these criteria, including:

  • Student services teams
  • Education technology teams
  • Operations teams
  • IT teams

Think about whether the chatbot’s use case points to an appropriate home for the core team. For instance, if the purpose of the chatbot is to support student admissions, then an effective option may be to base the core development team within the team responsible for student admissions.

Another point to consider is that where significant work is anticipated to integrate the chatbot with an institution’s existing systems, the content development team may need to work at a similar rate to the relevant technical teams to avoid bottlenecks on either side. In such cases, it may be practical to select a core development team that sits within, or works closely alongside, the relevant IT team, to support synchronisation of workstreams within the chatbot project.

Content development - additional domain experts

It may be the case that the core content development team does not have exhaustive knowledge of all the topics that should be included in the chatbot’s question-and-answer set.

Consider an education technology team tasked with creating and curating the question-and-answer set of a chatbot designed to address general student queries. It is anticipated that the chatbot will need to address queries on topics including: admissions, amenities, exams, graduation, careers, and pastoral support.

To ensure the chatbot’s question-and-answer set anticipates the types of questions that will be asked and includes useful, informative answers, the education technology team will need to work with colleagues from a range of other departments to source relevant insights.

As part of the process of preparing for chatbots, it is suggested that additional domain experts are identified and engaged early, so that there can be clear lines of communication between the core content development team and additional domain experts.

One suggestion on how to do this is to appoint ambassadors from different departments, who will be tasked with liaising with the core content development team, to ensure the chatbot’s knowledge base is strong in their particular area of expertise. For instance, the education technology team may work with an ambassador in the pastoral team, who will advise on useful answers to the types of questions that students may ask around pastoral support.

Delegating content creation work

Another key decision to make in terms of how the core development team and the additional experts work together is around delegation. Developing a high-quality knowledge base for a chatbot is a skill, and as such it will often be advisable to give one properly trained team editorial responsibility for structuring this knowledge base.

This principle indicates the approach of using additional domain experts as advisers/consultants, who provide information about the types of questions that need to be addressed and how to answer them, but are not directly involved in inputting content into the chatbot’s question-and-answer set.

That said, if the additional domain experts are given training on how to structure question-and-answer sets, it could also be sensible for these individuals to take on the responsibility for directly inputting content into the question-and-answer set.

Content development – maintenance and ongoing development

The work involved in curating the chatbot’s content does not end once the chatbot has been launched. In developing the first live version of the chatbot’s knowledge base, assumptions will have been made about the kinds of questions users will ask, and the information they will find useful in response. When the chatbot is put into action, an opportunity is created to review these assumptions and to improve the chatbot’s knowledge base so that it more accurately reflects users' needs. Also, while driving the chatbot’s maintenance and ongoing development may not be a primary concern during the preparation process, considering the following questions at an early stage will be beneficial in the long run.

The first question to ask is:

“What data should be collected in order to evaluate the effectiveness of the content?” 

The ultimate judge of the chatbot’s effectiveness is direct feedback from the user, or, where this is not available, proxy feedback on the chatbot’s performance that enables inferences to be made about user experience. As an example of direct user feedback, some chatbots are designed so that every time the chatbot gives a response, a user is asked whether the response was useful or not. Overall chatbot performance can then be judged on metrics such as the proportion of answers that were stated to be useful.

An example of proxy feedback would be a complete log of all the questions users asked along with the answers the chatbot gave. From line-by-line analysis, the proportion of questions that were answered directly can be established, which gives an indication of how well the chatbot is performing.
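Both kinds of feedback reduce to simple metrics. The sketch below assumes a hypothetical log format (the field names are invented for illustration) and computes the proportion of rated responses marked useful (direct feedback) and the proportion of questions answered directly (proxy feedback):

```python
# Hypothetical log records; field names are illustrative only.
feedback_log = [
    {"question": "When is enrolment?", "answered": True,  "user_rated_useful": True},
    {"question": "Where is the gym?",  "answered": True,  "user_rated_useful": False},
    {"question": "asdf??",             "answered": False, "user_rated_useful": None},
]

# Direct feedback: proportion of rated responses marked useful.
rated = [r for r in feedback_log if r["user_rated_useful"] is not None]
useful_rate = sum(r["user_rated_useful"] for r in rated) / len(rated)

# Proxy feedback: proportion of questions answered directly.
answer_rate = sum(r["answered"] for r in feedback_log) / len(feedback_log)

print(f"useful: {useful_rate:.0%}, answered: {answer_rate:.0%}")
# → useful: 50%, answered: 67%
```

Tracking both metrics over time shows whether content changes are actually improving the user experience, not just the answer rate.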

Whilst the quality of the content is not the only factor that contributes to the chatbot’s performance, it is a significant one. The quality of the chatbot’s knowledge base influences its likelihood of correctly identifying the meaning behind a user’s query. For instance, for a chatbot to effectively answer student questions about graduation, those developing the chatbot will need to have anticipated both the questions students will ask about graduation and the ways in which these questions will be asked. Furthermore, they will also need to input informative and easy-to-understand answers to these questions.

A further question to ask is:

“What should the workflow be for ongoing development?” 

Even established chatbots will not accurately answer user questions 100% of the time. Some users will ask questions that would have been difficult to anticipate. Others will ask questions in a way that even a highly effective chatbot would not be able to parse. Meanwhile, there will also be cases where user inputs are not necessarily intended to get a sensible answer (facetious comments, for instance). This means that it is appropriate to think of ongoing development as making the chatbot better, not perfecting it.

The key phases involved in ongoing development are:

  1. Decide on priorities for improving the chatbot/addressing any shortcomings. This can be done by analysing data to identify the circumstances in which the chatbot is not providing correct/useful information to users. Alternatively, it can be done by identifying new types of information the chatbot could give to students (e.g. for a chatbot focusing on amenities, new information about amenities in the local area may be added to the knowledge base).
  2. Decide on capacity and priorities. It may be that the capacity apportioned for ongoing development is fixed, and so the focus here is on prioritisation. Alternatively, it may be that capacity is allocated based on the rate at which improvements need to be made.
  3. Update the content as required.
  4. Evaluate the impact of the changes based on data. If the changes have been effective, the same data used to identify the initial issues should show the chatbot’s accuracy improving over time. The evaluation will also highlight further areas for development.
  5. Repeat.
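Step 1 above can be as simple as ranking the topics the chatbot failed on by how often they occur. A minimal sketch, assuming unanswered questions have already been tagged with topic labels (whether by manual review or by a platform’s own classification is an assumption here):

```python
from collections import Counter

# Hypothetical extract of unanswered user questions, each tagged
# with a topic label. The questions and labels are invented.
unanswered = [
    ("Where can I park?", "amenities"),
    ("Is there parking on campus?", "amenities"),
    ("How do I appeal an exam grade?", "exams"),
    ("Where can I park my bike?", "amenities"),
]

# Rank topics by how often the chatbot failed to answer them, so
# the content team can target the biggest gaps first.
priorities = Counter(topic for _, topic in unanswered).most_common()
print(priorities)
# → [('amenities', 3), ('exams', 1)]
```

Even this simple frequency count gives the content team an evidence-based order in which to tackle gaps in the knowledge base.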

Engaging technical teams

The chatbot may need to be integrated into existing systems in two ways:

  • Front-end integration refers to the process of making the chatbot accessible via one or more platforms (for example an institution’s website, a student app, a virtual learning environment).
  • Back-end integration refers to the process of connecting the chatbot with one or more sources of data (for example an information management system) so that the chatbot can draw upon data when providing responses, or so that the chatbot can feed data from user responses back into the relevant system.

The same teams/individuals may be responsible for both types of integration; however, this will not always be the case.

For front-end integration, the key question to consider is:

"On which platforms should the chatbot be placed?"

Consider which platforms your target users are most likely to use.

For back-end integration, the central question to consider is whether integration with data sources is needed for your purposes.

Chatbots can function effectively without the capacity to draw in data from other sources. For instance, the chatbot used as part of the Jisc chatbot pilot can answer frequently asked questions about college life without needing to provide personalised answers to students (which would require drawing from college information management systems).

If a student asks the Jisc pilot chatbot “when is my next lesson?”, the chatbot will respond with a link to the student timetable, and the student can access their own information by inputting their username and password. If the chatbot were integrated with the college’s information management systems (or wherever data on timetables is stored), the chatbot may be able to provide the student’s individualised timetable directly.
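A back-end integration of this kind might look like the sketch below, in which the chatbot builds a personalised reply from timetable data and falls back to the generic link when no record is found. The data structure, function name, and record layout are all assumptions for illustration, not part of the Jisc pilot.

```python
# Sketch of a back-end integration: the chatbot looks up timetable
# data instead of only linking to a login page. The dictionary is a
# stand-in for an information management system; all names invented.
TIMETABLE = {
    "student42": [{"day": "Mon", "time": "09:00", "lesson": "Maths"}],
}

def next_lesson_reply(student_id: str) -> str:
    """Build a personalised reply, falling back to the generic link."""
    lessons = TIMETABLE.get(student_id)
    if not lessons:
        # No integration data available: behave like the pilot chatbot
        # and point the student at the timetable portal instead.
        return "You can check your timetable at the student portal."
    first = lessons[0]
    return f"Your next lesson is {first['lesson']} at {first['time']} on {first['day']}."

print(next_lesson_reply("student42"))
print(next_lesson_reply("unknown"))
```

Note that even with integration in place, a fallback path is needed for users whose records cannot be matched, which is one reason the sign-off and effort questions below matter.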

When deciding whether back-end integration is needed, consider the following:

  • Will the ability to provide personalised responses to students add significant value?
  • Will the process of back-end integration (including securing sign off) be worth the effort?

Engaging promotional teams

For the chatbot to serve its purpose, users will need to know that the chatbot is available to them, where it can be accessed, and how it can be of benefit. In other words, the chatbot needs to be promoted effectively. Those who are coordinating the chatbot project should work with relevant specialists within the institution to establish:

  • The most suitable media for promotion
  • The tone and style of messaging

One thing to be mindful of during promotion is that users should gain appropriate expectations of what the chatbot can and can’t do. If users are led to believe the chatbot will be an all-purpose digital companion, they may be disappointed by its performance and thus feel disinclined to use it, despite its more limited but purposeful functionality.

Engaging users

Users should be engaged whilst preparing for chatbots, as their input can support many of the decisions discussed previously. Questions that you may want to explore with a sample of intended users include:

  • Technical integration - Via which platforms would you most want to use the chatbot?
  • Content - What kinds of questions would you want the chatbot to be able to answer/what information would you expect in the response?

Engaging users early can also be helpful if you want to set up testing of the chatbot before it is launched. Furthermore, it enables users to air any concerns and relevant preferences they may have around the chatbot. Users may have misgivings about chatbots where the interface reinforces negative stereotypes around particular groups in society. When conducting forums with students prior to the launch of the Jisc chatbot pilot, for instance, it was found that students were generally against gendered chatbots, and chatbots that presented themselves as being human. That said, there was reasonable consensus that chatbots should express some degree of personality in how they appear and how they use language.

User concerns and preferences will vary depending on context. As such, developing lines of communication with user groups and listening to user perspectives can be of great value.

About the national centre for AI in tertiary education

Jisc’s national centre for AI was established to accelerate the adoption of effective and ethical artificial intelligence solutions within colleges and universities. As part of working towards this goal, the centre has conducted a pilot in which chatbots have been deployed in four UK further education colleges.

Conducting this pilot has enabled the centre to develop expertise in educational chatbots, which has, in turn, motivated the publication of a series of guidance documents on the subject of chatbots. This guide is the second in that series. The first is an introduction to chatbots, which is intended for those who want to learn the basics of how chatbots function and how they can be used in education.

For more information about chatbots, read our guide: Developing high-quality question-and-answer sets for chatbots.

This guide is made available under a Creative Commons licence (CC BY-NC-ND).