Why AI companies are racing to build integrations
AI companies are aggressively investing in building integrations for their AI products so they can ingest knowledge from their users' 3rd party apps and files, and give their AI agents access to automate tasks across the stack. This article dives into the use cases and why there is such a strong sense of urgency right now.
Brian Yam, Head of Growth · 12 mins to read
Intro
Most B2B SaaS companies are evaluating how they can capitalize on the Gen-AI wave, and many have realized just how important integrations are to their AI product strategy. The two foundational roles that integrations play within the context of AI and RAG (retrieval augmented generation) are:
Ingesting data from external sources for contextual-awareness
Enabling your AI agent products/features to take action across applications
But to understand the importance of these two components, we first need to step back and understand what goes into building useful AI products to begin with.
AI Applications vs. Employees
We can all agree that the goal of any B2B AI product is to replace or streamline the jobs of real humans, aka employees.
As such, I like to think about B2B AI product development through the lens of reverse-engineering the makeup of high-performing employees.
It's not just that they have a breadth of knowledge of their company and market and a depth of knowledge in their discipline. It's their ability to apply that knowledge in the right contexts, and take meaningful action to solve specific problems.
Let's pretend you're building an AI sales rep product just to make this more tangible. What are the ingredients that make someone a really good rep?
Context & Knowledge:
The best reps deeply understand the products they’re selling, the pain points and value props the products offer, and the common objections and how to handle them. Beyond that, they also need to be aware of any historical conversations that have taken place with a given prospect or customer. In real life, a rep would build up this context over months of onboarding, through reading sales enablement documents in Google Drive/Notion, asking questions in Slack threads, and reviewing their CRM for previous activities and touchpoints.
Building an AI product that can do that same job effectively would require you to equip your AI product (via RAG) with all of this rich contextual data that is spread across your users’ external apps and files.
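To make the retrieval side concrete, here is a minimal, dependency-free sketch of the RAG retrieval step. The documents, scoring function, and prompt format are all hypothetical stand-ins; production systems typically use embedding similarity rather than the simple term overlap used here.

```python
import re

# Hypothetical sketch of RAG retrieval: score docs against the query,
# then prepend the best matches to the prompt as context.

def score(query: str, doc: str) -> int:
    """Count how many words the query and document share (toy relevance)."""
    q = set(re.findall(r"\w+", query.lower()))
    d = set(re.findall(r"\w+", doc.lower()))
    return len(q & d)

def retrieve_context(query: str, docs: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k documents most relevant to the query."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:top_k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context so the model answers with awareness of it."""
    context = "\n".join(retrieve_context(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Toy stand-ins for content ingested from Google Drive, a CRM, etc.
docs = [
    "Pricing objections: emphasize ROI within the first quarter.",
    "Acme Corp renewal call notes: asked about SSO support.",
    "Onboarding guide for new sales reps.",
]
prompt = build_prompt("How should I handle pricing objections?", docs)
```

The model never "knows" this data; it simply sees the retrieved snippets inline in the prompt, which is why coverage of the underlying sources matters so much.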
Informed Action:
Being ‘all-knowing’ is a great start, but no one likes a smart employee who doesn’t actually do anything. The best reps execute consistently and methodically - they leverage all the context they have to inform what they should do, whether that be writing the perfect sales emails or building a business case to convince prospects to buy their product. Translating this over to our AI sales rep - once it is armed with all of the relevant context on our customers’ businesses, the AI also needs to be able to take action. This could be as simple as updating the CRM with notes from a sales call, or as complex as crafting highly personalized emails to prospects and/or calling them using a voice API in order to drive demos.
Now that we understand the building blocks of a useful AI product, the question then becomes: How does AI access this wealth of information and execute these actions across disparate systems? The answer lies in robust integrations.
Ingesting Context
All of the necessary knowledge and context AI products need is scattered across the dozens of apps teams use every single day. There are Slack conversations about almost any topic, project statuses in project management tools, sales data in CRMs, strategy or onboarding docs in file storage systems like Google Drive, and even important insights in sales calls recorded by Gong.
To get access to all of this knowledge, the engineering teams at AI companies need to build integrations and ingestion pipelines to extract this data from each of their customers' 3rd party apps (of which there are dozens). These pipelines need to pull not only all historical data, but also any ongoing changes (new, updated, or deleted records).
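A pipeline like this can be sketched as a cursor-based sync loop: a full backfill on the first run, then incremental pulls of only what changed. Everything below is hypothetical (the `changes()` API and integer cursor are stand-ins); real 3rd party APIs each have their own delta endpoints and pagination.

```python
class FakeSource:
    """Stand-in for a 3rd party API that exposes a change feed."""
    def __init__(self):
        # (record_id, payload, deleted) tuples in arrival order
        self.log = [
            ("doc1", "v1", False),
            ("doc2", "v1", False),
            ("doc1", "v2", False),   # doc1 was updated
            ("doc2", None, True),    # doc2 was deleted
        ]

    def changes(self, cursor):
        """Return all entries after `cursor`, plus the new cursor position."""
        start = cursor or 0
        return self.log[start:], len(self.log)

def sync(source, index, cursor=None):
    """Pull changes since `cursor` and apply them to the local index."""
    records, new_cursor = source.changes(cursor)
    for record_id, payload, deleted in records:
        if deleted:
            index.pop(record_id, None)   # deletions must propagate too
        else:
            index[record_id] = payload   # insert or update
    return new_cursor                    # persist for the next run

source, index = FakeSource(), {}
cursor = sync(source, index)                # full historical backfill
source.log.append(("doc3", "v1", False))    # a new record arrives later
cursor = sync(source, index, cursor)        # incremental: only pulls doc3
```

The key detail is that deletions are applied, not just additions; otherwise the AI keeps answering from records its users have already removed.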
Note: We put together a detailed tutorial on how AI platforms can use Paragon as the single point of access to build scalable ingestion pipelines for apps like Slack, Google Drive, and Notion.
The importance of real-time ingestion
Unlike most things, the 80/20 rule doesn't apply well to context coverage. In many use cases, especially anything customer-facing, missing any piece of context, even if it's just from the last hour, can result in terrible outcomes.
Here's an example - your AI sales rep is about to send a contract for signature to a prospect, unaware that the prospect raised a major issue in a support ticket five minutes ago and is threatening to cancel the evaluation. Imagine how frustrated that prospect would be to see their ticket ignored while the 'sales rep' acted as if everything was going smoothly.
Real-time integrations (powered by webhooks) are critical for ensuring that your AI application always has access to all of the context and information, minimizing the risk of missteps and maximizing the relevance of its actions.
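As a rough sketch of what happens when a webhook fires (the event names and payload shape here are assumptions, not any specific vendor's format): the handler applies each change to the context store the moment it arrives, so the AI never reasons over stale data.

```python
# Hypothetical webhook handler: push-based updates to the context store,
# as opposed to the pull-based sync loop used for backfills.

def handle_webhook(event: dict, store: dict) -> None:
    """Apply one change event to the AI's context store as it arrives."""
    kind = event["type"]                 # e.g. "record.created" (hypothetical)
    if kind == "record.deleted":
        store.pop(event["id"], None)     # remove stale context immediately
    else:                                # "record.created" or "record.updated"
        store[event["id"]] = event["data"]

store = {}
handle_webhook({"type": "record.created", "id": "t-42",
                "data": "Prospect raised a blocking support ticket"}, store)
```

In the sales rep scenario above, this is what lets the support ticket show up in the agent's context within seconds instead of at the next polling interval.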
Agentic Actions
Actions are the second phase: AI companies are building agents that can take action on behalf of users across 3rd party apps.
For an AI SDR product, this could include sending emails, creating Google Calendar invites for meetings, or sending Slack notifications. For an AI meeting assistant, this could be creating tasks in Asana from action items mentioned in a call, or updating the user’s CRM based on insights from a sales call.
In order to enable AI agents to perform these tasks, your engineering team would have to define, build out, and maintain every individual 3rd party action across all of the different applications.
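One common pattern for this is a small action registry: each 3rd party action is defined once, and the agent dispatches to it when the model selects that tool. The action name, signature, and tool-call shape below are hypothetical stand-ins for whatever your model provider's function-calling format looks like.

```python
# Hypothetical action registry for an AI agent.
ACTIONS = {}

def action(name):
    """Register a callable as an agent-invokable action."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@action("crm.update_notes")
def update_crm_notes(contact_id: str, notes: str) -> str:
    # In production this would call the CRM's API with managed auth;
    # here it just reports what it would have done.
    return f"updated {contact_id}"

def dispatch(tool_call: dict) -> str:
    """Run whichever registered action the model selected."""
    return ACTIONS[tool_call["name"]](**tool_call["arguments"])

# Simulate the model choosing a tool and supplying arguments.
result = dispatch({"name": "crm.update_notes",
                   "arguments": {"contact_id": "c-1", "notes": "demo booked"}})
```

The hard part isn't this dispatch layer; it's implementing and maintaining the dozens of per-app action bodies behind it, each with its own auth and API quirks.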
Note: We also built an agentic AI demo and tutorial that showcases how an AI chatbot can perform these 3rd party tasks based on a natural language input.
Fun fact: This AI agent space is growing so rapidly that some are coining it service-as-a-software (a play on words on software-as-a-service).
A few real-life examples
Let’s take a look at how AI companies are tackling this today.
Example 1 - Intercom
Intercom recently launched Fin, their AI copilot for support agents.
Context:
When Fin initially launched, it had one job - providing suggested responses to prospects' queries. Many of Intercom's users also use Zendesk, which is why they built an integration that ingests all of their users' historical Zendesk conversations.
However, outside of Zendesk, they have a huge opportunity to make it easier for users to provide additional sources of knowledge - currently it’s limited to manually uploading files.
Action:
The team also quickly realized that just providing answers from a knowledge-base was not enough. To drive significant ROI, they need to replace the work required for support agents to go look for information in 3rd party apps (such as subscription data in Stripe or order/inventory data in Shopify), and have begun building out ‘AI Actions’ for Fin.
Example 2 - Frame
Frame is an AI-powered collaboration suite (think Notion, but built for AI from day one) that gives AI employees access to data from different applications. They use Paragon to power all of their data ingestion pipelines from external apps, as well as the agentic actions their product's AI employees can perform.
Context
Frame’s AI employees need to be context-aware - their AI product managers need to have insight into all the product specs in Confluence and Notion, and all the issues in Jira.
Automation
Once the AI employees become aware of all of the organization's data across 3rd party apps, they can begin to take action, such as updating Airtable entries or Jira issues.
Check out how they shipped one integration per week with a single engineer in this case study.
Example 3 - Notion
Context
Notion recently launched Notion AI, their Q&A bot that can ingest data from users’ Slack and Google Drive. This enables it to answer questions by retrieving relevant information from these external sources (and across Notion data of course).
Action
Notion has not introduced any actions into their app as of now, although I suspect this is on their long-term roadmap.
A competitive edge - for now
With any major trend, the early opportunists benefit significantly more than those who are reactive. HubSpot, Slack, Klaviyo, and Atlassian are a few examples of companies that invested in integrations early on, back when ecosystem plays were novel, and they've built humongous moats around their ecosystems that have been largely untouchable.
As for the rest, building integrations has become a core requirement and the price of entry when going to market. That's why most SaaS startups today can't even get their first few customers without a few core integrations with their customers' systems of record.
The same is happening in AI - that’s why investors are pouring 9-10 figures into enterprise AI companies that are focused on scaling their integrations, as we’re in a land grab moment for data in this gen-AI wave.
And it makes sense why there’s so much convergence around context ingestion use cases. If you think about it, an AI that seamlessly connects with all of a company's tools and data becomes more than just a SaaS product that you can rip and replace—it becomes an indispensable part of customers’ operational fabric and sets the foundation to unlock many additional use cases.
Challenges with integrations for AI & RAG
Hopefully at this point we're all in agreement about the importance of integrations for AI. But then come the technical challenges of implementing said integrations.
Context is everywhere, and there’s a lot of it.
Being able to ingest all of your customers' Google Drive files, CRM data, Slack messages, etc. is actually a very big engineering challenge.
We outlined those challenges in more detail in our "Challenges with High Volume Ingestion" article, but the TL;DR:
Lots of data (high volume, large records, or both)
Many types of data
Your infrastructure needs to be able to handle all of this ingestion
3rd party rate limits can lead to data loss
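On the rate-limit point, here is a sketch of the usual mitigation: back off and retry instead of dropping the page. The `fetch_page` callable and its `(status, data)` return shape are assumptions for illustration, not a real client API.

```python
import time

def fetch_with_backoff(fetch_page, page, max_retries=5, base_delay=1.0):
    """Retry a page fetch with exponential backoff when rate-limited (429)."""
    for attempt in range(max_retries):
        status, data = fetch_page(page)
        if status != 429:                      # success or non-retryable error
            return data
        time.sleep(base_delay * 2 ** attempt)  # wait longer each attempt
    raise RuntimeError(f"page {page} still rate-limited after {max_retries} tries")
```

Without this kind of handling, a 429 in the middle of a backfill silently leaves a hole in the ingested context, which is exactly the data-loss failure mode above.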
On top of that, the many AI actions you'll need to build for your product's AI agents to automate tasks across 3rd party apps require you to research every 3rd party API, manage authentication, monitor for breaking changes, and a lot more - all of which can cost a lot of engineering hours to build and maintain.
This is why over 100 SaaS companies, including AI21, Copy.ai, and Frame, rely on Paragon as their integration infrastructure. Paragon handles all the complexities around integrations, including authentication, listening to 3rd party webhooks, scaling to handle high volume ingestion, monitoring, and much more, so engineering teams can focus on their core product.
Where to from here?
The race is on. AI companies that are capitalizing on the value of integrations across both knowledge ingestion and automation today will be one step ahead of the rest. Eventually the market will catch up, just like in traditional SaaS, but by then these context-aware AI agents will be deeply embedded in their customers' workflows, making them extremely hard to displace. It will be no different from how Salesforce maintains its market-leading position despite having a very subpar product.
If you want to get ahead and start scaling your AI product’s integrations today without having to hire entire integrations engineering teams, book a demo with Paragon.