AWS Only · Anthropic Models Used · Retail · AI & ML

Scaling Email Data Extraction Using LLMs at Foxintelligence

Foxintelligence, an organisation rapidly advancing into AI/ML, sought a solution to automate and streamline their internal operations. We proposed a proof-of-concept leveraging a retrieval-augmented generation (RAG) approach. The solution ingested and interpreted Foxintelligence's email data using large language models and semantic search, providing a prompt-based UI that allowed their team to easily extract key information and insights.

At a glance


Foxintelligence measures and improves the performance of digital brands and retailers by delivering the most innovative and privacy-first market and shopper insights in the world. They are part of the NielsenIQ family of businesses.

Challenge

Foxintelligence was struggling to efficiently process and extract insights from their unstructured data, such as customer emails. Their manual, time-consuming processes were limiting their ability to scale operations without increasing headcount.

Solution

Firemind proposed a proof-of-concept solution that leveraged AWS services such as Amazon Kendra, Amazon Bedrock, and a serverless architecture to address Foxintelligence's challenges. The solution ingested and interpreted Foxintelligence's unstructured data, providing a prompt-based interface to extract insights and automate manual tasks, enabling them to scale operations without increasing headcount.

Services Used

Amazon Bedrock
Amazon Kendra
AWS Lambda
Amazon API Gateway

Outcomes

Enhanced scalability, as the LLM-based solution replaced 6,000 parsers, greatly improving the system's ability to handle new email formats and languages.
Greater data accuracy, as Amazon Bedrock LLMs improved the precision and consistency of data extraction from email receipts.

Business challenges

Automating email data extraction for greater efficiency

Foxintelligence was facing challenges in efficiently processing and extracting insights from their unstructured data, particularly customer email communications. Their manual, time-consuming processes were hindering their ability to scale operations without increasing headcount and full-time employee (FTE) hours. They sought a solution to automate these tasks and free up their team to focus on higher-value work.

“In the first half of 2024, Firemind supported us in exploring and evaluating the cost-effectiveness of AWS Bedrock Large Language Models (LLMs) for our information extraction tasks on larger documents, while keeping or improving the quality. During this proof-of-concept, Firemind explored multiple LLMs and various usage strategies, balancing token usage and latency.

The outcomes were great: specific AWS Bedrock LLMs delivered a 40% cost reduction and twice the speed (at similar quality levels), compared to our current third-party LLM provider. Firemind's expertise and comprehensive approach have unlocked new technical opportunities for us that are both scalable and budget friendly.”

Mihai Rotaru, Head of Research and Development — TextKernel

Solution

AI-powered insight extraction made easy

Foxintelligence engaged Firemind to address their challenges in efficiently processing and extracting insights from their unstructured data, particularly customer email communications. Foxintelligence, an innovative organisation rapidly advancing into the world of AI and machine learning, had an internal innovation department that was exploring the potential of these technologies to improve their operations and gain a competitive edge.

Firemind proposed a comprehensive proof-of-concept solution leveraging various AWS services to create a scalable, serverless system. The first step was to establish a robust data ingestion pipeline that could handle any text document, including unstructured data. This was achieved by utilising Amazon S3 to ingest and store the documents, such as customer emails, as they were uploaded.
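As a rough illustration of this ingestion step, the sketch below (Python with boto3) uploads an email document into an S3 bucket. The bucket name, key layout, and helper name are assumptions for illustration, not details from the engagement.

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and key prefix; the real names were not part of this case study.
INGEST_BUCKET = "foxintelligence-email-ingest"

def ingest_email(local_path: str, message_id: str) -> None:
    """Upload a raw email (e.g. extracted text) to S3, where it can later be indexed."""
    s3.upload_file(
        Filename=local_path,
        Bucket=INGEST_BUCKET,
        Key=f"emails/{message_id}.txt",
    )
```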

To enable smart semantic search capabilities, Firemind implemented Amazon Kendra, which allowed the large language model to efficiently retrieve the most relevant documents from the data repository. This was a crucial component, as it provided the foundation for the chain-of-thought reasoning process that was at the heart of the solution.
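A minimal sketch of that retrieval step, assuming a pre-built Kendra index over the ingested emails (the index ID and helper name below are placeholders), might use the Retrieve API to pull back the passages most relevant to a prompt:

```python
import boto3

kendra = boto3.client("kendra")

# Placeholder index ID; in practice this points at the index built over the S3 documents.
KENDRA_INDEX_ID = "REPLACE-WITH-KENDRA-INDEX-ID"

def retrieve_passages(question: str, top_k: int = 5) -> list[str]:
    """Return the passages Kendra considers most relevant to the user's prompt."""
    response = kendra.retrieve(
        IndexId=KENDRA_INDEX_ID,
        QueryText=question,
        PageSize=top_k,
    )
    return [item["Content"] for item in response.get("ResultItems", [])]
```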

The user interface was developed using React and hosted on Amazon S3, with the content delivered through a CloudFront distribution. This frontend was secured using AWS WAF and integrated with Amazon Cognito, ensuring that only authenticated users could access the system. The UI provided a prompt-based experience, allowing Foxintelligence’s team to easily enter queries and retrieve responses from the large language model.
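The case study does not detail the authentication flow, but a common pattern with Cognito user pools is to exchange a user's credentials for a token that API Gateway can validate on each request. The sketch below shows that exchange for illustration only; the app client ID and auth flow are assumptions, and a React front end would typically do this through Amplify rather than boto3.

```python
import boto3

cognito = boto3.client("cognito-idp")

def sign_in(username: str, password: str, app_client_id: str) -> str:
    """Exchange user credentials for an ID token that the API can validate.

    Assumes USER_PASSWORD_AUTH is enabled on the (hypothetical) app client.
    """
    response = cognito.initiate_auth(
        AuthFlow="USER_PASSWORD_AUTH",
        AuthParameters={"USERNAME": username, "PASSWORD": password},
        ClientId=app_client_id,
    )
    return response["AuthenticationResult"]["IdToken"]
```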

At the core of the solution was the use of the Claude V2 model, hosted on Amazon Bedrock. This powerful large language model was leveraged to reason over the data retrieved by the semantic search, providing contextual responses to the user prompts. The combination of Kendra’s search capabilities and the LLM’s reasoning abilities enabled Foxintelligence’s team to extract valuable insights from their unstructured data, automating manual tasks and freeing them up to focus on higher-value work.
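A simplified sketch of that generation step might look like the following; the prompt wording and parameters are illustrative, while the model ID and request shape follow the Claude V2 text-completion format on Amazon Bedrock.

```python
import json
import boto3

bedrock = boto3.client("bedrock-runtime")

def answer_with_context(question: str, passages: list[str]) -> str:
    """Ask Claude V2 on Amazon Bedrock to answer a prompt grounded in retrieved passages."""
    context = "\n\n".join(passages)
    prompt = (
        "\n\nHuman: Use the email excerpts below to answer the question.\n\n"
        f"<excerpts>\n{context}\n</excerpts>\n\n"
        f"Question: {question}\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": 512,
            "temperature": 0.2,
        }),
    )
    return json.loads(response["body"].read())["completion"]
```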

The entire system was designed with a serverless architecture, utilising AWS Lambda and Amazon API Gateway to handle the backend processing and API requests. This approach ensured scalability, cost-efficiency, and ease of maintenance, aligning with Firemind’s commitment to AWS best practices and methodologies.
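Putting the pieces together, a Lambda function behind an API Gateway proxy integration could look roughly like this sketch, which reuses the hypothetical retrieve_passages and answer_with_context helpers from the earlier examples:

```python
import json

# Assumes the retrieve_passages() and answer_with_context() sketches above are
# packaged alongside this handler.

def handler(event, context):
    """Entry point for the prompt-based UI (API Gateway proxy integration)."""
    body = json.loads(event.get("body") or "{}")
    question = body.get("question", "")

    passages = retrieve_passages(question)
    answer = answer_with_context(question, passages)

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"answer": answer}),
    }
```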

Through this proof-of-concept, Firemind demonstrated how Foxintelligence could leverage the power of generative AI and AWS services to streamline their internal operations and scale their business without the need to increase their full-time employee headcount. The solution provided a flexible and secure platform that empowered Foxintelligence’s team to extract insights and automate manual tasks, ultimately driving greater efficiency and productivity within the organisation.

Streamlined data processing

Firemind's solution allowed Foxintelligence to automate the ingestion and interpretation of their unstructured data, such as customer emails, freeing up their team to focus on more strategic initiatives. The integration of AWS services like Amazon Kendra and Amazon Bedrock provided a scalable and efficient way to extract valuable insights from the data.

Increase in productivity

By implementing a prompt-based user interface, Firemind empowered Foxintelligence's team to easily interact with the large language model and retrieve relevant information. This enabled them to complete manual tasks more quickly and with greater accuracy, leading to a significant increase in productivity and a reduction in headcount requirements.

Why Firemind

Firemind was selected as the partner for this project due to its deep expertise in AWS services, which was crucial for delivering an effective solution for automating and analysing email data at scale. Their extensive experience with technologies like Amazon Kendra, Amazon Bedrock, AWS Lambda, and Amazon API Gateway ensured that they could leverage these tools to meet Foxintelligence’s specific needs.

Additionally, Firemind’s proven track record with high-profile clients such as the Premier League and Vodafone demonstrated their ability to deliver high-quality, tailored solutions. This history of success gave Foxintelligence confidence in Firemind’s capability to address their challenges effectively and efficiently.

Firemind’s specialisation in data, machine learning, and generative AI also made them a strong fit for the project. Their focus on these areas was essential for building a solution that could efficiently ingest, interpret, and generate actionable insights from unstructured email data.

Moreover, Firemind’s local presence in the UK allowed for onshore support and a more personalised approach. Their ability to offer tailored solutions that aligned with Foxintelligence’s operational requirements further solidified their position as the ideal partner for this project.
