
Build vs Buy: Authentication for Integrations

Learn what it's like building authentication for integrations from scratch versus with Paragon.

Is it really that hard to build my own integrations?

It’s a question I’ve been wondering since I started working at Paragon. And unless you’ve taken a stab at building integrations, working with OAuth flows, and interacting with 3rd-party APIs, you’ve probably wondered the same thing. As a developer advocate, I don’t work on our main product and am not tasked with building out integrations on our platform. So to get a better perspective on the build vs buy decision that our prospects and customers go through, I decided to embark on the journey of building integrations from scratch and compare my experience to building with Paragon.

Building In-House: The Project Plan

Building out integrations involves a lot of components - from user authentication to working with different 3rd-party APIs to building infrastructure for large jobs. These are topics we’ll cover in our build vs buy series. For this exercise, I focused on building out just the authentication piece (which already has its list of challenges) for three integrations - Google Drive, Slack, and Salesforce.

I wanted to put myself in our customers' shoes. I built out native, multi-tenant integrations for my demo SaaS application, which I named ParaHack, meaning my application needed to:

  • allow users to authenticate and authorize their 3rd-party accounts from within my application

  • handle multiple tenants/users and keep their credentials and data separate

  • provide a native integration experience where users have their 3rd-party credentials saved so they don’t constantly need to re-auth and our application can interact with the 3rd-party API over and over

After building these native, multi-tenant integrations, I also implemented an extremely thorough, fool-proof, robust testing system (AKA one simple API request per integration to prove the authentication process is working).

With these requirements in place, I mapped out this project architecture:

Building In-House: The Process

Step 1: Integration Provider Documentation

Reading external documentation is one of those things that I love and hate. I love it when the docs answer all my burning questions and are easy to follow. I hate it when I struggle to decipher opaque error codes and spend hours trying things, only to find out I skipped over a sentence in the docs written in size 10 font.

Although OAuth is a standard of sorts, each integration provider will have different hoops they require client applications to jump through. Some key pieces of information I learned to look out for in 3rd-party OAuth documentation as I read through Google, Slack, and Salesforce’s references are:

  • how to register our app in their developer console

  • the necessary scopes and configurations we need to enable and request

  • the endpoints to retrieve access and refresh tokens

  • the format for interacting with their APIs (JSON vs form-urlencoded vs URL parameters)

  • token expiration and revoke policies
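To make that last-but-one point concrete, here's a quick sketch (with placeholder values) of how the same token-request parameters end up in three different shapes depending on the provider:

```typescript
// The same OAuth token-request parameters, serialized three ways. Which shape
// a given provider expects is exactly the kind of detail buried in their docs.
const params: Record<string, string> = {
  grant_type: "authorization_code",
  code: "AUTH_CODE_FROM_REDIRECT", // placeholder value
};

// 1. JSON body (some providers)
const asJson = JSON.stringify(params);

// 2. x-www-form-urlencoded body (e.g. what Salesforce turned out to require)
const asForm = new URLSearchParams(params).toString();

// 3. Query parameters appended to the URL (a hypothetical provider endpoint)
const asQuery = "https://provider.example.com/token?" + asForm;
```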

Step 2: App Registration

For each integration provider, I went to their developer console to create an external client app. While not difficult per se, there were many nuances I had to learn along the way, such as the fact that certain scopes were necessary to have permission to request refresh tokens (important if you don’t want your users to log into their accounts every time your application attempts to send an API request).

The developer console is also where I was able to get the client ID and client secret necessary for sending requests to the integration provider during the OAuth flow.

const initiateDriveOauth = () => {
    const params = new URLSearchParams({
        client_id: process.env.NEXT_PUBLIC_GOOGLE_CLIENT_ID ?? "",
        redirect_uri: getBackendOrigin() + "/oauth/googledrive",
        response_type: 'code',
        scope: 'https://www.googleapis.com/auth/drive.file',
        include_granted_scopes: 'true',
        state: 'pass-through value',
        access_type: "offline",
        prompt: "consent"
    }).toString();

    window.location.href = "https://accounts.google.com/o/oauth2/v2/auth?" + params;
};

Step 3: Client/Server Code Implementations

Revisiting the architecture, we can see that our application has to act as an orchestrator between our users and their integration providers. For integrations to work, there's a constant back and forth to make sure we always have valid access tokens for our users' Google Drive, Slack, or Salesforce APIs.

From a high level, I didn’t think it looked too difficult to build an app to redirect the user to a 3rd-party login page, wait for a redirect back to our application with an access code, and then start requesting tokens. In practice, this process wasn’t always easy to implement.
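For reference, here's a minimal sketch of the Google half of that exchange, written as a request builder so the moving pieces are visible. The client credentials and redirect URI below are placeholders; Google's token endpoint does accept a form-urlencoded POST:

```typescript
// Builds the POST request that trades the redirect's authorization code for
// access and refresh tokens. Placeholder credentials are used here; in the
// real app these came from the developer console and environment variables.
function buildGoogleTokenRequest(code: string) {
  return {
    url: "https://oauth2.googleapis.com/token",
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      code,
      client_id: "YOUR_GOOGLE_CLIENT_ID",
      client_secret: "YOUR_GOOGLE_CLIENT_SECRET",
      redirect_uri: "https://example.com/oauth/googledrive",
      grant_type: "authorization_code",
    }).toString(),
  };
}

// The backend's redirect handler would then do something like:
// const req = buildGoogleTokenRequest(codeFromQueryString);
// const tokens = await fetch(req.url, req).then((r) => r.json());
```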

For example, nowhere in Salesforce's refresh token documentation does it say what the request body format should be. I assumed that since data is sent back in JSON, the request parameters should also be sent in JSON format. Only by perusing a Salesforce Stack Exchange post did I find out that the request body must be in x-www-form-urlencoded format. Shout out to the Stack Exchange user metadaddy. Absolute hero.
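Here's roughly what the fix looked like - a sketch, with placeholder client credentials, of the refresh request body serialized the way Salesforce actually wants it:

```typescript
// Salesforce's refresh request must be x-www-form-urlencoded, even though the
// response comes back as JSON. The client credentials below are placeholders.
function buildSalesforceRefreshBody(refreshToken: string): string {
  return new URLSearchParams({
    grant_type: "refresh_token",
    refresh_token: refreshToken,
    client_id: "YOUR_SALESFORCE_CLIENT_ID",
    client_secret: "YOUR_SALESFORCE_CLIENT_SECRET",
  }).toString();
}

// POSTed with header "Content-Type": "application/x-www-form-urlencoded"
// to https://login.salesforce.com/services/oauth2/token
```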

Another challenge of building integrations across different providers is that each has different token expiration policies. Google's API was very explicit about when tokens expired, allowing me to keep that expiration date in a database and only request refreshes when needed.

const driveCreds = await getDriveCredentialByEmail(response.email);
if (driveCreds[0] && new Date(Number(driveCreds[0].access_token_expiration)) < new Date()) {
    console.log("need refresh");
    await refreshDriveAccessToken(driveCreds[0], response.email);
}

For Salesforce, it’s not well documented when access tokens expire. Try googling “salesforce access token expiration.” All of the top results are blog posts or forums.

I decided to implement a different pattern where I would try to refresh my access token after encountering a 401 status code.

let salesforceCreds = await getSalesforceCredentialByEmail(response.email);
let accountResponse = await getAccounts(salesforceCreds[0]);

if(accountResponse.status === 401){
    console.log("refreshing");
    const refreshed = await refreshSalesforceToken(salesforceCreds[0]);
    if(refreshed){
        salesforceCreds = await getSalesforceCredentialByEmail(response.email);
        accountResponse = await getAccounts(salesforceCreds[0]);
    }
}

Step 4: Storing Credentials and Tokens

I needed a database to store access and refresh tokens so my users wouldn't need to constantly re-authenticate with their integration providers. Each integration needed its own schema, since each provider's credentials include different fields.
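As a sketch of what those schemas looked like (the table and column names mirror the insert statements used in ParaHack; the SQL itself is an assumption), assuming SQLite:

```typescript
// Per-integration credential tables, assuming SQLite. Slack needs only an
// access token, while Salesforce also needs a refresh token and an instance
// URL - so each provider gets its own table.
const createTableStatements = [
  `CREATE TABLE IF NOT EXISTS slack_credentials (
     id TEXT PRIMARY KEY,
     email TEXT,
     access_token TEXT
   )`,
  `CREATE TABLE IF NOT EXISTS salesforce_credentials (
     id TEXT PRIMARY KEY,
     email TEXT,
     access_token TEXT,
     refresh_token TEXT,
     instance_url TEXT
   )`,
];
```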

For example, in Slack I just needed to store access tokens: Slack does not have token rotation enabled by default and, as a result, its access tokens don't expire.

export async function insertSlackCredential(credential: SlackCredential){
    const db = new sqlite3.Database("./credentials.db", sqlite3.OPEN_READWRITE);
    const sql = 'INSERT INTO slack_credentials (id, email, access_token) VALUES (?,?,?)';
    try {
        await insertUpdate(db, sql, [credential.id, credential.email, encrypter.encrypt(credential.access_token)]);
    } catch (err) {
        console.log(err);
    } finally {
        db.close();
    }
}

For Salesforce, I needed to store different fields. This time, I did need a refresh token - since access tokens expire - as well as an “instance URL” to keep track of a user’s Salesforce subdomain.

export async function insertSalesforceCredential(credential: SalesforceCredential){
    const db = new sqlite3.Database("./credentials.db", sqlite3.OPEN_READWRITE);
    const sql = 'INSERT INTO salesforce_credentials (id, email, access_token, refresh_token, instance_url) VALUES (?,?,?,?,?)';
    try {
        await insertUpdate(db, sql, [credential.id, credential.email, encrypter.encrypt(credential.access_token), encrypter.encrypt(credential.refresh_token), credential.instance_url]);
    } catch (err) {
        console.log(err);
    } finally {
        db.close();
    }
}

Although most integration providers have similar mechanisms, I realized that it's hard to reuse code and spin up new integrations quickly, since each provider implements its OAuth flow and APIs differently.
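One pattern that could have helped (a hypothetical refactor, not something ParaHack actually implemented) is pushing the per-provider differences into a config object, so a single OAuth flow implementation can consume them:

```typescript
// A hypothetical per-provider config: the details that varied across Google,
// Slack, and Salesforce become data instead of copy-pasted code.
interface OAuthProviderConfig {
  name: string;
  authorizeUrl: string;
  tokenUrl: string;
  scopes: string[];
  tokenRequestFormat: "json" | "form-urlencoded";
  tokensExpire: boolean; // false for Slack's default non-rotating tokens
}

const slackConfig: OAuthProviderConfig = {
  name: "slack",
  authorizeUrl: "https://slack.com/oauth/v2/authorize",
  tokenUrl: "https://slack.com/api/oauth.v2.access",
  scopes: ["chat:write"],
  tokenRequestFormat: "form-urlencoded",
  tokensExpire: false,
};
```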

Step 5: Testing

After all the long nights of reading documentation, debugging the Google, Slack, and Salesforce APIs, and refreshing tokens, it was all worth it: ParaHack can now make a successful, authenticated test request against each of the three integrations.

Buying: The Process

OK, phew. My super “robust” end-to-end testing was a success.

A logical next question is: What’s it like building authentication with Paragon?

Embedded iPaaS products like Paragon have pre-built UI components (as well as a headless implementation if you prefer building your own UI) that fully manage authentication for integrations within your application. Paragon stores all of your users' 3rd-party tokens and handles refreshes and all the other authentication-related nuances.

While you still need to register your app (step 2 above) with each 3rd-party provider, the actual authentication process (steps 3 and 4) is extremely streamlined with Paragon. Simply add the integration Connect Portal components directly into your application to kick off the OAuth flow. Paragon handles the rest of the authentication - the OAuth flow, storing credentials, and retrieving the right token per user for subsequent API requests. All it takes is a few lines of code in your frontend using Paragon's SDK.

<Button onClick={() => paragon.connect(integration.type, {})}>
  {integrationEnabled ? "Manage" : "Enable"}
</Button>

Because this article focuses just on the authentication aspect of integrations, we'll end it there. If you're interested in how Paragon fits into your backend to perform API requests, read this article on how Paragon fits into your product tech stack.

Wrapping Up

All jokes aside, building the authentication for integrations in-house was a good experience for me to learn about the ins and outs of the OAuth flow and the nuances of working with 3rd-party APIs. Although not impossible by any means, building authentication is a non-trivial amount of work. It takes careful documentation reading and some thought on how best to design your application for handling different token policies, different data fields coming back, and different API behaviors across integration providers.

I hope this article was enjoyable and informative. I’ll be writing more of these build vs buy articles focused on building actual use cases - like data ingestion and bidirectional sync - from scratch and then with Paragon. Stay tuned if you enjoyed this format!

In the meantime, for more information on integrations, check out our AI with integrations tutorial, our build vs buy article, and other articles on our blog.

Jack Mu, Developer Advocate


Ship native integrations 7x faster with Paragon

Ready to get started?

Join 150+ SaaS & AI companies that are scaling their integration roadmaps with Paragon.
