This library provides a Dagster integration with Airbyte.
The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.
The time (in seconds) to wait between successive polls.
Default Value: 10
The maximum time (in seconds) to wait before this operation times out. By default, this will never time out.
Default Value: None
If True, materializations corresponding to the results of the Airbyte sync will be yielded when the op executes.
Default Value: True
If provided and yield_materializations is True, these components will be used to prefix the generated asset keys.
Default Value: ['airbyte']
Executes an Airbyte job sync for a given connection_id, and polls until that sync completes, raising an error if it is unsuccessful. It outputs an AirbyteOutput which contains the job details for a given connection_id.
It requires the use of the airbyte_resource, which allows it to communicate with the Airbyte API.
Examples:
from dagster import job
from dagster_airbyte import airbyte_resource, airbyte_sync_op

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

sync_foobar = airbyte_sync_op.configured({"connection_id": "foobar"}, name="sync_foobar")

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_simple_airbyte_job():
    sync_foobar()

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_composed_airbyte_job():
    final_foobar_state = sync_foobar(start_after=some_op())
    other_op(final_foobar_state)
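In addition to connection_id, the polling and materialization options documented above can be passed in the same configuration dictionary. A minimal sketch, assuming the config field names poll_interval, poll_timeout, yield_materializations, and asset_key_prefix:

from dagster_airbyte import airbyte_sync_op

# Sketch: configure polling and materialization behavior for the sync op.
# Field names other than connection_id are assumed from the options above.
sync_foobar = airbyte_sync_op.configured(
    {
        "connection_id": "foobar",
        "poll_interval": 30,              # check sync status every 30 seconds
        "poll_timeout": 3600,             # fail the op if the sync exceeds one hour
        "yield_materializations": True,   # emit materializations for synced tables
        "asset_key_prefix": ["airbyte"],  # prefix applied to generated asset keys
    },
    name="sync_foobar",
)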
The Airbyte Server Address.
Port for the Airbyte Server.
Whether to use HTTPS to connect to the Airbyte server.
Default Value: False
The maximum number of times requests to the Airbyte API should be retried before failing.
Default Value: 3
Time (in seconds) to wait between each request retry.
Default Value: 0.25
Whether to forward Airbyte logs to the compute log. This can be expensive for long-running syncs.
Default Value: True
This resource allows users to programmatically interface with the Airbyte REST API to launch syncs and monitor their progress. This currently implements only a subset of the functionality exposed by the API.
For a complete set of documentation on the Airbyte REST API, including expected response JSON schema, see the Airbyte API Docs.
To configure this resource, we recommend using the configured method.
Examples:
from dagster import job
from dagster_airbyte import airbyte_resource

my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
    }
)

@job(resource_defs={"airbyte": my_airbyte_resource})
def my_airbyte_job():
    ...
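The remaining options documented above can be supplied in the same configuration dictionary. A sketch, assuming the config field names use_https, request_max_retries, request_retry_delay, and forward_logs:

from dagster_airbyte import airbyte_resource

# Sketch: a more fully-specified resource configuration. Field names beyond
# host and port are assumed from the options documented above.
my_airbyte_resource = airbyte_resource.configured(
    {
        "host": {"env": "AIRBYTE_HOST"},
        "port": {"env": "AIRBYTE_PORT"},
        "use_https": False,            # connect over plain HTTP (the default)
        "request_max_retries": 5,      # retry failed API requests up to 5 times
        "request_retry_delay": 1.0,    # wait one second between retries
        "forward_logs": False,         # skip forwarding Airbyte logs for long syncs
    }
)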
Builds a set of assets representing the tables created by an Airbyte sync operation.
connection_id (str) – The Airbyte Connection ID that this op will sync. You can retrieve this value from the “Connections” tab of a given connector in the Airbyte UI.
destination_tables (List[str]) – The names of the tables that you want to be represented in the Dagster asset graph for this sync. This will generally map to the name of the stream in Airbyte, unless a stream prefix has been specified in Airbyte.
normalization_tables (Optional[Mapping[str, List[str]]]) – If you are using Airbyte’s normalization feature, you may specify a mapping of destination table to a list of derived tables that will be created by the normalization process.
asset_key_prefix (Optional[List[str]]) – A prefix for the asset keys inside this asset. If left blank, assets will have a key of AssetKey([table_name]).
upstream_assets (Optional[Set[AssetKey]]) – A set of assets to add as upstream sources.
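A sketch of how these arguments fit together; the connection ID, table names, and normalization mapping below are hypothetical:

from dagster_airbyte import build_airbyte_assets

# Sketch: build asset definitions for an Airbyte connection that syncs two
# tables, one of which is normalized into a derived table. All identifiers
# below (connection ID, table names, derived table names) are hypothetical.
airbyte_assets = build_airbyte_assets(
    connection_id="87b7fe85-a22c-420e-8d74-b30e7ede77df",
    destination_tables=["releases", "tags"],
    normalization_tables={"releases": ["releases_authors"]},
    asset_key_prefix=["github"],
)

The resulting asset definitions can then be combined with the airbyte_resource described above so that materializing them triggers the corresponding Airbyte sync.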