Sources are how events are ingested into Convoy. In this section, we explain the different types of sources and their use cases. Convoy currently supports three types of sources: REST API, HTTP Ingestion, and Message Brokers.

REST API

This is an authenticated API for pushing events to a Convoy instance. It is designed specifically for outgoing projects to push events to a specific endpoint, and it is automatically available to all outgoing projects.
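As a sketch, pushing an event could look like the snippet below. The `/api/v1/projects/{id}/events` path, the Bearer auth scheme, and the `base_url`/`api_key` values are assumptions for illustration; confirm them against your instance's API reference.

```python
# Sample event payload, in the format shown later on this page
payload = {
    "endpoint_id": "01GTBP6SX313EZN6X3QE29CW6Z",
    "event_type": "compliance.completed",
    "custom_headers": {"X-Event-Key": "Event XYZ"},
    "data": {},
}


def push_event(base_url: str, project_id: str, api_key: str, event: dict) -> dict:
    """POST an event to a Convoy instance.

    The path and Bearer auth here are assumptions; check your
    instance's API reference for the exact route and auth scheme.
    """
    import requests  # third-party: pip install requests

    resp = requests.post(
        f"{base_url}/api/v1/projects/{project_id}/events",
        json=event,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```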

HTTP Ingestion

This is an unauthenticated API for receiving webhook events from third-party providers like GitHub, Shopify, etc. It is designed for incoming projects to receive events from any provider. For each provider, a source needs to be configured with the necessary verification. Once configured, we provide you with a unique link to supply to the provider.

Verification

Source verification can be configured in four different ways on Convoy:

  • HMAC: HMAC is a common mechanism that most providers, such as Shopify and Stripe, support for webhooks. Creating an HMAC source looks like this:
    create hmac source

For the HMAC verification mechanism, Convoy supports both simple and advanced signatures.
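A simple signature is typically a digest of the raw request body computed with a shared secret. The sketch below assumes HMAC-SHA256 with hex encoding (confirm the hash function and encoding your source is configured with), and shows both how a provider would compute the signature and how it would be checked:

```python
import hashlib
import hmac


def simple_signature(secret: bytes, body: bytes) -> str:
    # Hex-encoded HMAC-SHA256 over the raw request body
    return hmac.new(secret, body, hashlib.sha256).hexdigest()


def verify(secret: bytes, body: bytes, received: str) -> bool:
    # Constant-time comparison to avoid leaking timing information
    return hmac.compare_digest(simple_signature(secret, body), received)
```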

  • Basic Authentication: While not widely supported, some providers use this mechanism to verify events. Creating a Basic Auth source looks like this:

    create basic auth source

  • API Keys: Similar to Basic Authentication, API keys, while not widespread, are used by some providers, such as Mono, to verify events. Creating an API key source looks like this:

    create API key source

  • Custom Verification: For some providers, like GitHub and Twitter, the core verification mechanisms aren’t sufficient. Though they wrap around the core mechanisms, these modules have to be built specifically for each provider.

Currently, we support GitHub, with support for Twitter and Shopify planned. You can request new sources by sending an email to [[email protected]](https://getconvoy.io/cdn-cgi/l/email-protection).
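For reference, GitHub is a good example of why a custom module wraps a core mechanism: it sends an `X-Hub-Signature-256` header whose value is the literal prefix `sha256=` followed by a hex HMAC-SHA256 of the raw body. A minimal check looks like:

```python
import hashlib
import hmac


def verify_github_signature(secret: bytes, body: bytes, header_value: str) -> bool:
    # GitHub sends: X-Hub-Signature-256: sha256=<hex hmac of the raw body>
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, header_value)
```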

Message Brokers

Message Brokers provide extra reliability when ingesting events from backend services for dispatch to client endpoints. With this, disparate services write events to a queue or topic, then Convoy reads off the queue or topic and sends the events to client endpoints. This source type is designed for, and only available to, outgoing projects.

Google PubSub

To ingest events using Google PubSub, follow the steps outlined below:

  • Create a PubSub Topic

    create google pubsub topic

  • Create a Subscription

    create a subscription

  • Create a Service Account with PubSub Admin Role

    create service account

  • Generate Service Account JSON Key

    generate service account json key

  • Configure Source: Supply your Project ID, Topic Name, Subscription, and upload your service account JSON key.

  • Send Events: Publish JSON events to the topic in the following format:

Sample Payload
{
   "endpoint_id": "01GTBP6SX313EZN6X3QE29CW6Z",
   "event_type": "compliance.completed",
   "custom_headers": {
      "X-Event-Key": "Event XYZ"
   },
   "data": {}
}
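The publishing side can be sketched with Google's official Python client (the `google-cloud-pubsub` package); the project and topic names you pass in are placeholders for the ones created above:

```python
import json


def build_event(endpoint_id: str, event_type: str, data: dict,
                custom_headers: dict = None) -> bytes:
    """Serialize an event in the format above, ready for the message body."""
    return json.dumps({
        "endpoint_id": endpoint_id,
        "event_type": event_type,
        "custom_headers": custom_headers or {},
        "data": data,
    }).encode("utf-8")


def publish_event(project_id: str, topic_id: str, message: bytes) -> str:
    """Publish to a Pub/Sub topic.

    Requires google-cloud-pubsub and GOOGLE_APPLICATION_CREDENTIALS
    pointing at the service account JSON key generated earlier.
    """
    from google.cloud import pubsub_v1  # imported lazily; third-party package

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    return publisher.publish(topic_path, message).result()  # message ID
```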

The payload is exactly the same as the one used with our REST API.

Amazon SQS

To ingest events using Amazon SQS, follow the steps outlined below:

  • Create an IAM User for authenticating with the SQS Queue and attach the AmazonSQSFullAccess policy to the user

    create IAM user
    attach AmazonSQSFullAccess policy

  • Under the security credentials tab for the IAM user, generate a new Access Key. Take note of the Access Key and Secret Key generated

    generate a new access key
    create CLI access key

  • Create an SQS Queue and specify the ARN of the IAM user under the access policy

    create sqs queue
    add the IAM user under access policy

  • Configure Source: Supply your Access Key, Secret Key, Queue Name, and Default Region.

  • Send Events: Write JSON events to the queue in the following format:

Sample Payload
 {
   "endpoint_id": "01GTBP6SX313EZN6X3QE29CW6Z",
   "event_type": "compliance.completed",
   "custom_headers": {
      "X-Event-Key": "Event XYZ"
   },
   "data": {}
 }
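With the Access Key and Secret Key from the earlier step available (for example via the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables), sending such a message can be sketched with boto3; the queue URL and region below are placeholders:

```python
import json

# Example payload, identical to the REST API format
event = {
    "endpoint_id": "01GTBP6SX313EZN6X3QE29CW6Z",
    "event_type": "compliance.completed",
    "custom_headers": {"X-Event-Key": "Event XYZ"},
    "data": {},
}


def send_event(queue_url: str, event: dict, region: str = "us-east-1") -> dict:
    """Send one event to an SQS queue.

    Requires boto3 and AWS credentials for the IAM user created above.
    """
    import boto3  # imported lazily; third-party package

    sqs = boto3.client("sqs", region_name=region)
    return sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps(event))
```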

Kafka

The Kafka integration uses consumer groups for high availability and fault tolerance.

The payload is exactly the same as the one used with our REST API.
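Producing such events can be sketched with the `confluent-kafka` client; the broker address and topic name are placeholders. On the consuming side, Convoy's use of consumer groups means topic partitions are rebalanced across Convoy instances automatically, which is where the availability and fault-tolerance gains come from.

```python
import json


def produce_event(bootstrap_servers: str, topic: str, event: dict) -> None:
    """Write one JSON event to a Kafka topic.

    Requires the confluent-kafka package and a reachable broker.
    """
    from confluent_kafka import Producer  # imported lazily; third-party package

    producer = Producer({"bootstrap.servers": bootstrap_servers})
    producer.produce(topic, value=json.dumps(event).encode("utf-8"))
    producer.flush()  # block until the message is delivered
```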