In software applications, key processes such as code builds and tests can be automated and streamlined using continuous integration and delivery (CI/CD) and DevOps practices, delivering superior business outcomes: improved productivity, performance efficiency, faster time to market and reduced cycle times.
CI/CD and the cloud are a natural match. The cloud replaces manual provisioning of resources and physical infrastructure, while continuous integration automates the building, testing and deployment of application code. Combined well, the two can eliminate many of the manual, time-consuming processes in software delivery.
While CI/CD can be employed for most cloud-related services, CI/CD automation support is not natively available for every service in the cloud. A case in point is Amazon Lex, a service widely used to build chatbots. Lex uses Natural Language Understanding (NLU) and Natural Language Processing (NLP) to recognize human language and conversational intent, support continuous learning, and offer personalized responses to customers.
However, Lex lacked the provisioning and infrastructure support needed for CI/CD automation: the integrating APIs, together with the infrastructural and operational plumbing to power them, were unavailable. Given that the future of business is conversational, this lack of API support for CI/CD automation is a major hindrance to the quick and efficient development of chatbots for business applications. Even with CI/CD integration in place, the right APIs had to be created from scratch, or selected from a pre-built list, for the deployed chatbots to function effectively.
Chatbots: The bridge between data and decision
Gartner predicted that by 2020, chatbots would see a multitude of applications across industries and domains and come to dominate the business landscape. A specialized use case is the Conversational Business Intelligence (BI) Bot, which “talks” to an organization’s data and provides customized business insights. Business users can hold a conversation with such a bot to get insights and make decisions on the move, whether in the board room or on the go, far from data analysts, effectively closing the gap between data and decision.
Having identified the huge potential of BI Bots, Agilisium worked on a CI/CD-based approach to reduce the development time of a new BI Bot for a specific client. This was an ambitious attempt, given that Lex lacked native support for CI/CD automation.
The challenges faced in this attempt and the solutions that were developed are detailed below.
Challenge 1: Lack of Cloud Native Support for Lex
AWS CloudFormation, an infrastructure-provisioning service, manages AWS resources through well-defined code templates written as simple text files. Unlike most other AWS services, Lex is not currently supported natively by CloudFormation Templates (CFTs), so a custom application had to be developed to provision and spin up Lex resources across regions and accounts.
Solution: To bridge the gap between CloudFormation and Lex and offer continuous support, Lex resources were configured through AWS Lambda, the serverless compute service, which runs the necessary code from CloudFormation Templates and manages the bots. Lambda also handles parallel execution of code and scales up continuously with the available compute resources. Lambda fulfillment code, triggered as custom code, simplifies automatic scale-up and backup handling with zero administrative effort.
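The custom-resource pattern described above can be sketched as follows. This is a minimal, hypothetical example of a Lambda handler backing a CloudFormation custom resource for a Lex (V1) bot; the property names and bot settings are illustrative assumptions, not Agilisium's actual implementation.

```python
def build_bot_definition(props):
    """Translate CloudFormation custom-resource properties into a
    payload for the Lex model-building API's PutBot operation."""
    return {
        "name": props["BotName"],
        "locale": props.get("Locale", "en-US"),
        "childDirected": False,  # required flag on every Lex bot
        "intents": [
            {"intentName": name, "intentVersion": "$LATEST"}
            for name in props.get("Intents", [])
        ],
    }

def handler(event, context):
    """Entry point invoked by CloudFormation on stack create/update/delete."""
    # boto3 is imported lazily so build_bot_definition can be unit-tested
    # without AWS credentials or the AWS SDK installed.
    import boto3
    lex = boto3.client("lex-models")  # Lex V1 model-building service
    props = event["ResourceProperties"]
    if event["RequestType"] in ("Create", "Update"):
        lex.put_bot(**build_bot_definition(props))
    elif event["RequestType"] == "Delete":
        lex.delete_bot(name=props["BotName"])
    # A real handler must also signal SUCCESS/FAILED back to the
    # pre-signed CloudFormation URL (e.g. via the cfnresponse helper).
```

CloudFormation invokes such a function through the `ServiceToken` property of a custom resource, so the bot is created, updated and deleted in lockstep with the rest of the stack.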
Challenge 2: Manual Handling of the Bot Training Statements and Slots
Adding training statements to the Lex chatbot through APIs and GUIs, along with the initial infrastructure set-up, was carried out manually, which is error-prone. It required specially designed APIs to program the user interface and handle the statements. One example is filling slots in SQL queries: the chatbot collects slot values during the conversation and, once the slots are filled, fires API calls to return instant responses. Ideally a chatbot would be trained on real conversations to understand, reason and generate answers, but this is time-consuming.
Solution: To replace the manual handling of the training statements and slots, the bot definitions were automated through Lex's model-building APIs, which provide secure and easy-to-use automation. Lex supports Lambda integration for data retrieval, updates and business-logic execution, and through its console and APIs Lex can create chatbots in minutes and build conversational interfaces into the application. In place of hand-built APIs, these resources were configured separately using CFTs, simplifying the hardware and infrastructure provisioning required to power the bots.
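To illustrate, training statements and slots can be defined in code rather than typed into the console. The sketch below assembles a payload for the Lex model-building API's PutIntent operation; the intent name, sample utterances, slot types and fulfillment Lambda ARN are illustrative assumptions.

```python
def build_intent_definition(name, utterances, slots, fulfillment_arn):
    """Assemble a PutIntent payload: sample utterances train the NLU
    model, and slots capture the values later bound into SQL queries
    by the fulfillment Lambda."""
    return {
        "name": name,
        "sampleUtterances": utterances,
        "slots": [
            {
                "name": slot_name,
                "slotType": slot_type,
                "slotConstraint": "Required",
                "valueElicitationPrompt": {
                    "messages": [{
                        "contentType": "PlainText",
                        "content": "Which {}?".format(slot_name),
                    }],
                    "maxAttempts": 2,
                },
            }
            for slot_name, slot_type in slots
        ],
        "fulfillmentActivity": {
            "type": "CodeHook",
            "codeHook": {"uri": fulfillment_arn, "messageVersion": "1.0"},
        },
    }

# Hypothetical example: an intent that answers sales questions per region.
payload = build_intent_definition(
    name="GetSalesInsight",
    utterances=["What were sales in {Region}", "Show {Region} sales"],
    slots=[("Region", "AMAZON.US_STATE")],
    fulfillment_arn="arn:aws:lambda:us-east-1:123456789012:function:BiBotFulfillment",
)
# The payload would then be applied with
# boto3.client("lex-models").put_intent(**payload).
```

Because the whole intent lives in version-controlled code, every CI/CD run rebuilds it identically, removing the copy-paste errors of manual console entry.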
Challenge 3: Bot susceptibility to intermittent failures
While adding training statements or bot inputs into Lex, repeating an intent name caused the entire deployment to fail. To ensure secure and successful retrieval of data, an end-to-end rollback system had to be implemented to handle such failures. Agilisium identified this behaviour through testing carried out across multiple scenarios.
Solution: Amazon Redshift, a fast and scalable data warehouse in the cloud, is fault-tolerant, robust and offers secure data storage. For chatbots, Redshift stores user data and training statements as database records, preventing the occurrence of duplicate data. Redshift also takes little time to set up and deploy, and offers cost-effective, scalable and secure querying into data lakes. Coupled with CloudFormation, Redshift manages the state of Lex resources during an application rollback and serves as a durable data source for storing and preserving crucial data.
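The end-to-end rollback idea can be sketched as follows. This is a simplified, hypothetical illustration: the create/delete callables stand in for real Lex API calls (such as put_intent and delete_intent), and a duplicate intent name triggers a full rollback in reverse order of creation.

```python
class LexDeployment:
    """Track every resource created during a bot deployment so that a
    mid-deployment failure can be rolled back end to end."""

    def __init__(self):
        self._created = []  # (name, delete_fn) pairs, in creation order

    def create(self, name, create_fn, delete_fn):
        # Duplicate intent names break the Lex deployment, so detect
        # them up front and undo everything created so far.
        if any(existing == name for existing, _ in self._created):
            self.rollback()
            raise ValueError("duplicate intent name: " + name)
        create_fn(name)
        self._created.append((name, delete_fn))

    def rollback(self):
        # Delete in reverse order so dependent resources go first.
        for name, delete_fn in reversed(self._created):
            delete_fn(name)
        self._created.clear()
```

In the actual solution described above, the record of created resources and training statements lives in Redshift rather than in memory, so a rollback can survive even a failure of the deployment process itself.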
Using the above-mentioned CI/CD approach, Agilisium has developed “Analytics Anywhere” Conversational BI Bot, which receives queries from users through chat and provides business insights. This chatbot provides human-like responses with a mix of visuals and text to business users.
The entire BI chatbot application was configured using CloudFormation Templates written in JSON and YAML, and bundled into an AWS Quick Start package. This package provides a stable, mature and easy-to-deploy chatbot solution for businesses.
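In such a template, the custom Lex resource appears alongside native resources. The fragment below is a hypothetical illustration of the pattern (resource and property names are assumptions), showing how a custom resource type delegates provisioning to a Lambda function via `ServiceToken`:

```yaml
Resources:
  AnalyticsAnywhereBot:
    Type: Custom::LexBot            # no native AWS::Lex type exists, so a
    Properties:                     # Lambda-backed custom resource is used
      ServiceToken: !GetAtt LexProvisionerFunction.Arn
      BotName: AnalyticsAnywhere
      Intents:
        - GetSalesInsight
```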
Chatbot Quick Start
For organizations interested in trying out this chatbot solution, Agilisium has launched its Conversational BI Bot solution, “Analytics Anywhere”, in AWS Quick Start. In less than 25 minutes, organizations can now experience the power of this solution, which helps them get insights into pre-defined critical business questions.
AWS Quick Starts are accelerators containing templates and deployment guides that let organizations quickly deploy and experience a partner’s solution in an AWS environment. They are rigorously tested for AWS best practices, security and high availability.
While organizations can experiment with the preloaded data in Agilisium’s Quick Start, they can also bring their own data by following the instructions in the deployment guide.
For additional information on Agilisium’s Quick Start deployment, please click this link: https://aws.amazon.com/quickstart/architecture/agilisium-conversational-bi-bot/
The lack of a single source of truth and of quality data, together with ad hoc manual reporting processes, undermined top management’s visibility into integrated insights on sales, sales-rep interactions, marketing reach, brand performance, market share and territory management. Understandably, the client wanted to align information that had hitherto sat in silos, to gain a 360-degree view of product movement, optimize sales planning and gain a competitive edge.