You can request new connectors directly here.
Note: Airbyte is not built on top of Singer, but is compatible with Singer's protocol. Airbyte's ambitions go beyond what Singer enables, so we are building our own protocol that will remain compatible with Singer's.
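For context, both protocols exchange newline-delimited JSON messages over stdout, which is what makes compatibility practical. A minimal sketch of the two RECORD message shapes (the stream name and payload are illustrative, not from a real connector):

```python
import json
import time

# A Singer-style RECORD message: the stream and payload sit at the top level.
singer_record = {
    "type": "RECORD",
    "stream": "users",                       # illustrative stream name
    "record": {"id": 1, "email": "a@b.co"},  # illustrative payload
}

# The equivalent Airbyte RECORD message: the payload is wrapped in a "record"
# object together with the stream name and an emission timestamp.
airbyte_record = {
    "type": "RECORD",
    "record": {
        "stream": "users",
        "data": {"id": 1, "email": "a@b.co"},
        "emitted_at": int(time.time() * 1000),  # milliseconds since epoch
    },
}

# Both protocols emit one JSON object per line on stdout.
print(json.dumps(singer_record))
print(json.dumps(airbyte_record))
```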
New sources: Hive, MongoDB, Mailgun
New destinations: Panoply, S3, MySQL, Kafka, MongoDB
New native Hubspot connector with schema folder populated
Facebook Marketing connector: add option to include deleted records
Bug fixes:
Google Analytics: add the ability to sync custom reports
Apple App Store: bug fix to correctly run incremental syncs
Exchange rates: UI now correctly validates input date pattern
File Source: Support JSONL (newline-delimited JSON) format (see the example below)
Freshdesk: Enable controlling how many requests per minute the connector makes, to avoid exceeding rate limits
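On the JSONL item above: newline-delimited JSON is simply one self-contained JSON object per line, which makes it easy to read row by row. A minimal reading sketch (the file name and fields are illustrative):

```python
import json

# Example contents of events.jsonl (file name and fields are illustrative):
#   {"id": 1, "event": "signup"}
#   {"id": 2, "event": "purchase"}

# Parse each non-empty line as its own JSON object.
with open("events.jsonl", encoding="utf-8") as handle:
    records = [json.loads(line) for line in handle if line.strip()]

for record in records:
    print(record["id"], record["event"])
```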
1 new destination connector: MeiliSearch
Other fixes:
Thanks to @ns-admetrics for contributing an upgrade to the Shopify source connector, which now provides the landing_site field containing UTM parameters in the Orders table.
Sendgrid source connector supports most endpoints available in the API
Facebook Source connector now supports syncing Ad Insights data
Freshdesk source connector now supports syncing satisfaction ratings and conversations
Microsoft Teams source connector now gracefully handles rate limiting
Bug fix in Slack source where the last few records in a sync were sporadically dropped
Bug fix in Google Analytics source where the last few records in sync were sporadically dropped
In the Redshift source, support non-alphanumeric table names (see the sketch below)
Bug fix in the GitHub source where syncs didn’t always fail if there was an error while reading data from the API
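On the Redshift table-name item above: names containing dashes, spaces, or other special characters have to be addressed as delimited (double-quoted) identifiers. A minimal sketch of the idea using psycopg2 (the DSN and table name are illustrative, and this is not the connector's actual code):

```python
import psycopg2
from psycopg2 import sql

# Delimited (double-quoted) identifiers let Redshift and Postgres address
# tables whose names contain dashes, spaces, or other special characters.
table_name = "order-items 2020"  # illustrative, not a real table

conn = psycopg2.connect("dbname=analytics user=airbyte")  # illustrative DSN
with conn, conn.cursor() as cur:
    # sql.Identifier quotes and escapes the name safely: "order-items 2020"
    query = sql.SQL("SELECT COUNT(*) FROM {}").format(sql.Identifier(table_name))
    cur.execute(query)
    print(cur.fetchone()[0])
```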
Sources whose reliability we improved (and that became “certified”):
Certified sources: Files and Shopify
Enhanced continuous testing for Tempo and Looker sources
Other fixes / features:
Correctly handle boolean types in the File Source
Add docs for App Store source
Fix a bug in Snowflake destination where the connector didn’t check for all needed write permissions, causing some syncs to fail
Improved reliability with our best practices on: Google Sheets, Google Ads, Marketo, Tempo
Support incremental syncs for Facebook and Google Ads (see the sketch below)
The Facebook connector now supports the Facebook Marketing API v9
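Incremental support, as noted for Facebook and Google Ads above, generally means keeping a cursor (such as a last-updated timestamp) in state between syncs and only requesting newer records. A minimal, hypothetical sketch of the pattern (fetch_since, the cursor field, and the sample data are illustrative, not the connectors' real code):

```python
from typing import Any, Callable, Dict, Iterable, List

Record = Dict[str, Any]

def read_incremental(
    fetch_since: Callable[[str], List[Record]],  # hypothetical API wrapper
    state: Dict[str, str],
) -> Iterable[Record]:
    """Yield only records newer than the saved cursor, then advance it."""
    cursor = state.get("updated_at", "1970-01-01T00:00:00Z")
    for record in fetch_since(cursor):
        yield record
        # Remember the highest cursor value seen so the next sync starts there.
        if record["updated_at"] > cursor:
            cursor = record["updated_at"]
    state["updated_at"] = cursor  # persisted between runs, e.g. as sync state

# Illustrative usage with in-memory data standing in for an API:
data = [
    {"id": 1, "updated_at": "2021-01-01T00:00:00Z"},
    {"id": 2, "updated_at": "2021-02-01T00:00:00Z"},
]
state: Dict[str, str] = {}
for rec in read_incremental(lambda since: [r for r in data if r["updated_at"] > since], state):
    print(rec)
print("next cursor:", state["updated_at"])
```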
Our new Connector Health Status page
1 new source: App Store (thanks to @Muriloo)
Fixes on connectors:
Bug fix writing boolean columns to Redshift
Bug fix where getting a connector’s input configuration hung indefinitely
Stripe connector now gracefully handles rate limiting from the Stripe API
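“Gracefully handles rate limiting” here, as in the Microsoft Teams item earlier, generally means backing off and retrying when the API responds with HTTP 429 instead of failing the sync. A minimal sketch of that pattern with the requests library (the retry budget and example URL are illustrative):

```python
import time
import requests

def get_with_backoff(url: str, max_retries: int = 5) -> requests.Response:
    """Retry on HTTP 429, honoring Retry-After or an exponential delay."""
    for attempt in range(max_retries):
        response = requests.get(url)
        if response.status_code != 429:
            response.raise_for_status()
            return response
        # Prefer the server's hint, fall back to exponential backoff.
        delay = float(response.headers.get("Retry-After", 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError(f"Still rate limited after {max_retries} attempts: {url}")

# Illustrative usage (not a real connector call):
# payload = get_with_backoff("https://api.example.com/v1/charges").json()
```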
1 new source: Tempo (thanks to @thomasvl)
Incremental support for 3 new source connectors: Salesforce, Slack and Braintree
Fixes on connectors:
Fix a bug in MSSQL and Redshift source connectors where custom SQL types weren't being handled correctly.
Improvements to the Snowflake connector from @hudsondba (batch size and sync timeout)
Fixes on connectors:
Fixed a bug in the GitHub connector where the connector didn’t verify that the provided API token was granted the correct permissions
Fixed a bug in the Google Sheets connector where rate limits were not always respected
Alpha version of the Facebook Marketing API v9 connector. This is a native Airbyte connector (the current one is Singer-based).
New sources: Plaid (contributed by tgiardina), Looker
New sources: Drift, Microsoft Teams
New sources: Intercom, Mixpanel, Jira Cloud, Zoom
New sources: Slack, Braintree, Zendesk Support
New sources: Redshift, Greenhouse
New destination: Redshift
New sources: Freshdesk, Twilio
New source: Recurly
New source: Sendgrid
New source: Mailchimp
New source: MSSQL
New source: Shopify
New sources: Files (CSV, JSON, HTML...)
New sources: Facebook Ads, Google Ads, Marketo
New destination: Snowflake
New sources: Salesforce, Google Analytics, Hubspot, GitHub, Google Sheets, Rest APIs, and MySQL
New destinations: we built our own connectors for BigQuery and Postgres, to ensure they are of the highest quality.
New sources: Stripe, Postgres
New destinations: BigQuery, Postgres, local CSV