Airbyte supports two types of connectors: Sources and Destinations. A connector takes the form of a Docker image which follows the Airbyte specification.
To build a new connector in Java or Python, we provide templates so you don't need to start everything from scratch.
Note: you are not required to maintain the connectors you create. The goal is that the Airbyte core team and the community help maintain the connector.
You can build a connector very quickly in Python with the Airbyte CDK, which generates 75% of the code required for you.
Before building a new connector, review Airbyte's data protocol specification.
To add a new connector you need to:
Implement & Package your connector in an Airbyte Protocol compliant Docker image
Add integration tests for your connector. At a minimum, all connectors must pass Airbyte's standard test suite, but you can also add your own tests.
Document how to build & test your connector
Publish the Docker image containing the connector
Each requirement has a subsection below.
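To make the protocol requirement concrete: the Airbyte Protocol is a stream of JSON messages written to stdout, one per line. As a rough sketch (the message shape follows the protocol's RECORD type; the `users` stream and its fields are invented for illustration), emitting a record boils down to:

```python
import json
import time

def emit_record(stream: str, data: dict) -> str:
    """Serialize and print one Airbyte Protocol RECORD message (one JSON object per line)."""
    message = {
        "type": "RECORD",
        "record": {
            "stream": stream,
            "data": data,
            "emitted_at": int(time.time() * 1000),  # epoch milliseconds
        },
    }
    line = json.dumps(message)
    print(line)
    return line

# Illustrative only: the "users" stream and its fields are made up for this sketch.
emit_record("users", {"id": 1, "name": "Jane"})
```

A real connector also implements the spec, check, and discover commands from the protocol; this sketch only shows the read-side output format.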
If you are building a connector in any of the following languages/frameworks, then you're in luck! We provide autogenerated templates to get you started quickly:
Python Source Connector
Singer-based Python Source Connector. Singer.io is an open-source framework with a large community and many available connectors (known as taps & targets). To build an Airbyte connector from a Singer tap, wrap the tap in a thin Python package to make it Airbyte Protocol-compatible. See the GitHub connector for an example of an Airbyte connector implemented on top of a Singer tap.
Generic Connector: This template provides a basic starting point for any language.
Java Destination Connector
Python Destination Connector
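To make the Singer-wrapping idea above concrete: a Singer tap emits RECORD messages in a slightly different shape than the Airbyte Protocol, so the core job of the thin wrapper is a per-line translation. A minimal sketch, assuming both message shapes (a real wrapper also needs to handle Singer SCHEMA and STATE messages, which this sketch simply skips):

```python
import json
import time
from typing import Optional

def singer_to_airbyte(line: str) -> Optional[str]:
    """Translate one Singer RECORD line into an Airbyte Protocol RECORD line.

    Singer:  {"type": "RECORD", "stream": "...", "record": {...}}
    Airbyte: {"type": "RECORD", "record": {"stream": "...", "data": {...}, "emitted_at": ms}}
    Non-RECORD Singer messages (SCHEMA, STATE) are skipped in this sketch.
    """
    msg = json.loads(line)
    if msg.get("type") != "RECORD":
        return None
    return json.dumps({
        "type": "RECORD",
        "record": {
            "stream": msg["stream"],
            "data": msg["record"],
            "emitted_at": int(time.time() * 1000),
        },
    })

# Illustrative input: the "issues" stream is invented for this sketch.
print(singer_to_airbyte('{"type": "RECORD", "stream": "issues", "record": {"id": 7}}'))
```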
Run the interactive generator and choose the relevant template using the arrow keys. This generates a new connector directory for you.
Search the generated directory for "TODO"s and follow them to implement your connector. For more detailed walkthroughs and instructions, follow the relevant tutorial:
As you implement your connector, make sure to review the Best Practices for Connector Development guide. Following best practices is not a requirement for merging your contribution to Airbyte, but it certainly doesn't hurt ;)
At a minimum, your connector must implement the acceptance tests described in Testing Connectors.
Note: Acceptance tests are not yet available for Python destination connectors. Coming soon!
If you're writing in Python or Java, skip this section -- it is provided automatically.
If you're writing in another language, please document the commands needed to:
Build your connector Docker image (usually this is just `docker build .`, but let us know if there are necessary flags, gotchas, etc.)
Run any unit or integration tests in a Docker image.
Your integration and unit tests must be runnable entirely within a Docker image. This is important to guarantee consistent build environments.
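For reference, a minimal Dockerfile for a Python connector might look like the sketch below. The package layout and `main.py` entrypoint are invented placeholders; the `io.airbyte.name` and `io.airbyte.version` labels follow the convention Airbyte's connector images use:

```dockerfile
FROM python:3.9-slim

WORKDIR /airbyte/integration_code
COPY . .
RUN pip install .

# The entrypoint receives the Airbyte Protocol commands (spec/check/discover/read).
ENV AIRBYTE_ENTRYPOINT "python /airbyte/integration_code/main.py"
ENTRYPOINT ["python", "/airbyte/integration_code/main.py"]

LABEL io.airbyte.name=airbyte/source-example
LABEL io.airbyte.version=0.1.0
```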
When you submit a PR to Airbyte with your connector, the reviewer will use the commands you provide to integrate your connector into Airbyte's build system as follows:
:airbyte-integrations:connectors:source-<name>:build should run unit tests and build the integration's Docker image
:airbyte-integrations:connectors:source-<name>:integrationTest should run integration tests including Airbyte's Standard test suite.
Typically this is handled by an Airbyter as part of code review. The section below describes the steps needed to publish a connector; it will mostly be used by Airbyte employees.
The steps for updating an existing connector are the same as for building a new connector minus the need to use the autogenerator to create a new connector. Therefore the steps are:
Iterate on the connector to make the needed changes
Add any needed docs updates
Create a PR to get the connector published
Once you've finished iterating on the changes to a connector as specified in its README.md, follow these instructions to ship the new version of the connector with Airbyte out of the box.
Bump the version in the Dockerfile of the connector.
Update the connector definition in the Airbyte connector index to use the new version:
airbyte-config/init/src/main/resources/seed/source_definitions.yaml if it is a source
airbyte-config/init/src/main/resources/seed/destination_definitions.yaml if it is a destination.
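An entry in these seed YAML files looks roughly like the sketch below. All values here are invented placeholders; only the field names reflect the seed file format:

```yaml
- name: Example API
  sourceDefinitionId: 00000000-0000-0000-0000-000000000000
  dockerRepository: airbyte/source-example
  dockerImageTag: 0.1.1   # the bumped version goes here
  documentationUrl: https://docs.airbyte.io/integrations/sources/example
```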
Update the connector JSON definition. To find the appropriate JSON file to update, find a JSON file named `<uuid>.json` where the UUID portion is the ID specified in the YAML file you modified in step 2. The relevant directories are:
airbyte-config/init/src/main/resources/config/STANDARD_SOURCE_DEFINITION/<uuid>.json for sources
airbyte-config/init/src/main/resources/config/STANDARD_DESTINATION_DEFINITION/<uuid>.json for destinations
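The JSON definition mirrors the YAML entry; keeping the version fields in sync between the two files is the point of this step. A sketch with the same placeholder values as above:

```json
{
  "sourceDefinitionId": "00000000-0000-0000-0000-000000000000",
  "name": "Example API",
  "dockerRepository": "airbyte/source-example",
  "dockerImageTag": "0.1.1",
  "documentationUrl": "https://docs.airbyte.io/integrations/sources/example"
}
```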
Submit a PR containing the changes you made.
One of Airbyte's maintainers will review the change and publish the new version of the connector to Docker Hub. Triggering tests and publishing connectors can be done by leaving a comment on the PR with the following format (the PR must be from the Airbyte repo, not a fork):
```
# to run integration tests for the connector
# Example: /test connector=connectors/source-hubspot
/test connector=(connectors|bases)/<connector_name>

# to run integration tests and publish the connector
# Example: /publish connector=connectors/source-hubspot
/publish connector=(connectors|bases)/<connector_name>
```
The new version of the connector is now available for everyone who uses it. Thank you!
In order to run integration tests in CI, you'll often need to inject credentials into CI. There are a few steps for doing this:
Place the credentials into Lastpass: Airbyte uses a shared Lastpass account as the source of truth for all secrets. Place the credentials into a secure note exactly as they should be used by the connector, i.e. it should basically be a copy-paste of the config.json passed into a connector via the --config flag. We use the following naming pattern: `<source OR destination> <name> creds`, e.g. `source google adwords creds` or `destination snowflake creds`.
Add the credentials to GitHub Secrets: To inject credentials into a CI workflow, the first step is to add them to GitHub Secrets, specifically within the "more-secrets" environment. Admin access to the Airbyte repo is required to do this. All Airbyte engineers have admin access and should be able to do this themselves. External contributors or contractors will need to request this from their team lead or project manager, who should have admin access. Follow the same naming pattern as all the other secrets, e.g. if you are placing credentials for source google adwords, name the secret SOURCE_GOOGLE_ADWORDS_CREDS. After this step, the secret will be available in the relevant GitHub workflows using the workflow secrets syntax.
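The mapping from the Lastpass note name to the GitHub secret name is mechanical: uppercase everything and replace spaces with underscores. A hypothetical helper, purely to illustrate the convention (this function does not exist in the Airbyte codebase):

```python
def secret_name_for(note_name: str) -> str:
    """Convert a Lastpass note name like 'source google adwords creds'
    into the matching GitHub secret name, per the naming convention above."""
    return note_name.strip().upper().replace(" ", "_")

print(secret_name_for("source google adwords creds"))
# SOURCE_GOOGLE_ADWORDS_CREDS
```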
Inject the credentials into the test and publish CI workflows: edit the workflow files (e.g. .github/workflows/test-command.yml) to inject the secret into the CI run. This makes the secret available to those workflow runs.
During CI, write the secret from environment variables to the connector directory: edit tools/bin/ci_credentials.sh to write the secret into the secrets/ directory of the relevant connector.
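Injecting a secret into a workflow is typically one `env` entry per secret. The job and step names in the excerpt below are invented placeholders, but `${{ secrets.… }}` is GitHub's standard workflow-secrets syntax:

```yaml
# Illustrative excerpt of a workflow like .github/workflows/test-command.yml
jobs:
  test:
    steps:
      - name: Run connector tests   # placeholder step name
        env:
          SOURCE_GOOGLE_ADWORDS_CREDS: ${{ secrets.SOURCE_GOOGLE_ADWORDS_CREDS }}
        run: ./tools/bin/ci_credentials.sh   # writes env secrets into secrets/ dirs
```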
That should be it.