As you might have read in our previous article, the Techlab team participated in the E.ON and Microsoft Hackathon at Microsoft’s German headquarters in Munich. That blog post is a great read if you would like to know more about our trip and the great time we had. In this article, we focus on the content of our project: what exactly did we achieve? The objective of our project was:
Building a smart skill matrix which will help management and teams to keep track of the available and needed skills on personal and team levels to deliver the work that is coming their way.
Below, you find the architectural overview of our solution.
We used GitHub to store our project’s code. The chatbot we created was built in Azure Bot Framework Composer and will eventually be deployed to the Azure Bot runtime (for now, we simply test it with the Emulator inside Bot Framework Composer). Lastly, our chatbot will be made available to our colleagues via either Microsoft Teams or Slack. The bot is connected to the database through Azure Functions, which contain the queries that save and search the data collected by the bot. The database is built in Cosmos DB. Both the Azure Functions and the Cosmos DB instance are created and configured through code, written in Visual Studio Code.
Let’s now zoom in on the different modules in this architecture. During the hackathon we mostly focused on the modules indicated by the red squares in the picture. Supporting elements, and elements that have not been implemented yet, are not discussed further in this article.
Chatbot (Azure Bot Framework Composer)
To start off this hackathon project, we focused on creating the first part of our solution: a chatbot. Our Microsoft experts advised us to build it in Azure Bot Framework Composer, a very user-friendly tool for creating chatbots. The hardest part was getting the Composer configured, since it depends on quite a few Azure services. Once that was sorted out, we proceeded to build our own chatbot.
The Bot Framework Composer works as follows: you define different flows and model a chatbot conversation within each flow, based on user input and if/else branches. These flows can be found in the upper left corner of the picture below. Inside a flow, you define the commands your chatbot executes: for instance, sending a specific sentence, or asking the user to choose an answer to a question. In the latter case, you can define follow-up actions based on the answer chosen by the user, for example starting one of the other flows, or calling an Azure Function to connect to the database. These examples can also be found in the image below. More details about the database and the Azure Functions are discussed later in this article.
When you want to test the different flows, you can use the Emulator to start a conversation with your chatbot. The Emulator also shows logs of everything happening in the back-end during the conversation, which makes it easy to spot mistakes in a flow while chatting.
To make things a bit more explicit, let’s walk through part of our solution in the Bot Framework Composer step by step. The picture above is a screenshot of the Bot Framework Composer. As mentioned earlier, the different flows we defined (Greeting, AddSkills, etc.) can be seen in the upper left corner. In the middle row, you can see that we selected our SearchSkill flow. In this flow, the user asks the chatbot for a list of colleagues who master a certain skill. First, the chatbot asks which skill the user needs help with (a prompt with multiple choice). Then user input is requested and the user selects one of the multiple-choice options. Based on that choice, a follow-up action is triggered: in our case, sending an HTTP request or sending a chatbot response.

The second image below shows what the chatbot conversation looks like in the Emulator. On the left you see that the user selected “Mulesoft” as an answer to the question modeled in the flow above. The chatbot then executes the necessary follow-up actions and responds with all people in the database who master the skill “Mulesoft”. On the right you see the logs of the conversation. For example, you can see that the chatbot sent the message “Hi! Welcome! I am Skilly, the ‘smart’ skill matrix…” and that an HTTP request was sent later on.
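Purely as an illustration (this is not Composer’s declarative format, and the skill list and names are made up), the branching logic of a SearchSkill-style flow could be sketched in plain Python, with the HTTP call to the Azure Function replaced by an injectable lookup so the flow runs standalone:

```python
# Illustrative sketch only: the branching logic of a SearchSkill-style flow.
# The HTTP request to the Azure Function is replaced by a "lookup" callable.

SKILL_CHOICES = ["Mulesoft", "Typescript", "NodeJS", "Angular"]

def search_skill_flow(user_choice, lookup):
    """Prompt outcome -> follow-up action, mirroring the flow described above."""
    if user_choice not in SKILL_CHOICES:
        # Chatbot-response branch: unrecognized answer, re-prompt the user.
        return "Sorry, I don't know that skill. Please pick one of: " + ", ".join(SKILL_CHOICES)
    # HTTP-request branch: in the real bot this calls an Azure Function.
    people = lookup(user_choice)
    if not people:
        return f"Nobody in the database masters {user_choice} yet."
    return f"These colleagues master {user_choice}: " + ", ".join(people)

# Stand-in for the Azure Function backed by Cosmos DB (hypothetical data).
def fake_lookup(skill):
    return {"Mulesoft": ["Alice", "Bob"]}.get(skill, [])

print(search_skill_flow("Mulesoft", fake_lookup))
# -> These colleagues master Mulesoft: Alice, Bob
```

In Composer itself, this same branching is modeled visually as a prompt, a choice branch and an HTTP-request action rather than as code.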
Graph Database (Cosmos DB)
Due to the nature of the data, Alex suggested it could be interesting to store it in a non-relational graph database. Such a database stores information in two tables: one with nodes and one with edges. Nodes are entities such as people, skills and departments. Edges represent relationships between nodes, for example Person (node) likes (edge) Person (node). Both nodes and edges can carry all kinds of attributes, which is useful for enriching your data with more variables.
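As a minimal sketch of that model (plain Python, not tied to any database; the ids and attributes are made up), both nodes and edges are just records with a label and a bag of attributes:

```python
# Minimal sketch of the node/edge model: two "tables", where both nodes
# and edges carry arbitrary key-value attributes.

nodes = [
    {"id": "alice", "label": "Person", "properties": {"first_name": "Alice"}},
    {"id": "bob",   "label": "Person", "properties": {"first_name": "Bob"}},
]

edges = [
    # Alice (node) likes (edge) Bob (node); the edge itself has attributes too.
    {"label": "likes", "source": "alice", "target": "bob",
     "properties": {"since": 2022}},
]

def neighbours(node_id, edge_label):
    """All target node ids reachable from node_id via edges with edge_label."""
    return [e["target"] for e in edges
            if e["source"] == node_id and e["label"] == edge_label]

print(neighbours("alice", "likes"))  # -> ['bob']
```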
To implement the graph database, we selected the Cosmos DB service of Microsoft Azure in combination with the Gremlin API. It proved a very suitable choice for trying things out and getting used to the graph data format. In the Azure portal you can manually add some nodes and edges to your database and see how they look in a visualization. In addition, the Gremlin API makes it easy to create nodes and edges in your database with standard query code.
To better explain how our database is designed and structured, I will discuss some sample data. The picture shows a visualization taken directly from the Cosmos DB service in Azure. The data behind this visualization contains three node types (Person, Skill and Team) and two edge types (“masters” and “is member of”). The data tables behind the visualization are shown below. Each node has an id, which is also shown in the diagram, and a type or label describing what kind of entity the node is. Lastly, each node can have properties, defined as a list of key-value pairs. The edges table works quite similarly; the difference is that the edge id is never really used, since an edge can be identified by its source and target nodes.
| id | label | properties |
| --- | --- | --- |
| Niels van der Horst | Person | <first_name, Niels>, <last_name, van der Horst> |
| Typescript | Skill | <name, Typescript>, <category, Front-end> |
| NodeJS | Skill | <name, NodeJS>, <category, Front-end> |
| Angular | Skill | <name, Angular>, <category, Front-end> |

| id | label | source | target | properties |
| --- | --- | --- | --- | --- |
| 1 | is member of | Niels van der Horst | Siterocks | |
| 2 | masters | Niels van der Horst | Typescript | <skill level, 1> |
| 3 | masters | Niels van der Horst | NodeJS | <skill level, 4> |
| 4 | masters | Niels van der Horst | Angular | <skill level, 1> |
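To make the structure concrete, the sample rows above (reading the “masters” edges as Person → Skill) can be loaded and queried in plain Python. This is only a sketch; against the real database the same question would be asked with a Gremlin traversal instead:

```python
# The sample nodes and edges from the tables above, as plain Python dicts.
nodes = [
    {"id": "Niels van der Horst", "label": "Person",
     "properties": {"first_name": "Niels", "last_name": "van der Horst"}},
    {"id": "Typescript", "label": "Skill",
     "properties": {"name": "Typescript", "category": "Front-end"}},
    {"id": "NodeJS", "label": "Skill",
     "properties": {"name": "NodeJS", "category": "Front-end"}},
    {"id": "Angular", "label": "Skill",
     "properties": {"name": "Angular", "category": "Front-end"}},
]

edges = [
    {"id": 1, "label": "is member of", "source": "Niels van der Horst",
     "target": "Siterocks", "properties": {}},
    {"id": 2, "label": "masters", "source": "Niels van der Horst",
     "target": "Typescript", "properties": {"skill level": 1}},
    {"id": 3, "label": "masters", "source": "Niels van der Horst",
     "target": "NodeJS", "properties": {"skill level": 4}},
    {"id": 4, "label": "masters", "source": "Niels van der Horst",
     "target": "Angular", "properties": {"skill level": 1}},
]

def who_masters(skill_id):
    """Return (person id, skill level) pairs for everyone mastering skill_id."""
    return [(e["source"], e["properties"]["skill level"])
            for e in edges if e["label"] == "masters" and e["target"] == skill_id]

print(who_masters("NodeJS"))  # -> [('Niels van der Horst', 4)]
```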
Connector between chatbot and database (Azure Functions)
To save the answers given to the chatbot into our Cosmos DB, we needed a connector. We found Azure Functions, which is similar to AWS Lambda. The connector works in two directions: loading data into the database based on the answers given to the chatbot, and querying the database to retrieve results. For saving data, an Azure Function is defined for each question-and-answer pair in the chatbot. The function matching the given answer is called and saves the data into the database, after which a success or error response is communicated back to the chatbot.
Retrieving data works similarly: we define specific functions for the questions and answers of the chatbot. When the chatbot receives a certain answer, the corresponding function is called and sends a search query to the database. The only difference is that these functions send a query result back to the chatbot, instead of just a success or error notification.
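The connector logic can be sketched as follows. This is a hedged, standalone sketch: a real Azure Function would run on the azure-functions runtime and submit the query through a Gremlin client, whereas here we only build Gremlin-style query strings and a dispatcher that mimics “one function per question/answer pair”; all payload fields are our own invention:

```python
# Hedged sketch of the connector: chatbot payload in, Gremlin-style query out.
# In the deployed version these would be HTTP-triggered Azure Functions.

def build_save_query(person_id, skill_id, level):
    """Turn a chatbot answer into a query that stores a 'masters' edge."""
    return (f"g.V('{person_id}').addE('masters')"
            f".to(__.V('{skill_id}')).property('skill level', {level})")

def build_search_query(skill_id):
    """Turn a search request into a query for everyone mastering a skill."""
    return f"g.V('{skill_id}').in('masters').values('first_name', 'last_name')"

def handle_request(payload):
    """Dispatch a payload to the matching builder, and report success or error
    back to the chatbot, as described above."""
    if payload.get("action") == "save":
        query = build_save_query(payload["person"], payload["skill"], payload["level"])
        return {"status": "success", "query": query}
    if payload.get("action") == "search":
        return {"status": "success", "query": build_search_query(payload["skill"])}
    return {"status": "error", "message": "unknown action"}

print(handle_request({"action": "search", "skill": "Mulesoft"})["query"])
```

The search variant returns the query result to the chatbot; the save variant only needs to report success or failure, which is why the two kinds of functions differ only in what they send back.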
Future steps: dashboarding & workplace integration
We are very proud of what we achieved during the hackathon. However, we are not ready for a deployed version of our chatbot yet. First things first, it is important that we discuss the project with several stakeholders within Essent to see whether we can integrate the chatbot into our way of working. Our IT teams already use skill matrices, but as of now there is no uniform or consistent way to create and compare them. We hope our solution can help compare skill information across teams and across ARTs as well. If we get the support we need, we will continue building this solution based on the architecture we drew up, and make the bot available to everyone working in Essent IT.
Looking forward, we would also like to add a report or dashboard to our solution. As mentioned before, skill matrices are already used in the organization, so producing skill matrix visualizations could be of great value to the company. Such visualizations could replace our current skill matrices and help managers get a clear overview of all available and missing skills in their teams. Going to the hackathon to learn new things was already a worthwhile adventure, but implementing these new ideas at Essent would be a great achievement. Exploring possibilities like this pushes the company forward and inspires the people working there.
See you in the next Hackathon
We hope that after reading this article, you have gained a bit more understanding of our project and what we built. We had a lot of fun building it and learned to use some new tools and even programming languages (who knew we could master C# in only 3 days?). We are very grateful to the organizers of the event and to our Microsoft experts Alex and Julian, who taught us some cool stuff in a short amount of time. If you have any questions about our project, or cool ideas on how to take it further, please don’t hesitate to contact us.
All in all, we had a great time during the hackathon. We learned to work with new technologies, got to know international colleagues and grew closer as a Techlab team, since it was the first time we went on such a trip together with the new team. We hope you enjoyed the read! And who knows, we might meet each other at the next Hackathon!