
Glossary of terms

The glossary contains an alphabetical list of terms you may encounter in the DBConvert Streams documentation. Click on a letter to jump to the corresponding section of the alphabet.

A

Adapter

See Source Reader

B

Binary Log (Binlog)

Binary Log (Binlog) is a collection of log files generated by a MySQL server that records information about data modifications. Typically, binary logs are used for data replication and data recovery. DBConvert Streams uses binary logs to capture change events such as INSERT, UPDATE, and DELETE and write them to the target.
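
A minimal sketch of how you might confirm that binary logging is enabled on a MySQL server before capturing changes; these are standard MySQL statements, not part of DBConvert Streams itself.

    -- Check whether binary logging is enabled (expected value: ON)
    SHOW VARIABLES LIKE 'log_bin';

    -- List the binary log files currently available on the server
    SHOW BINARY LOGS;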

C

CDC Mode

In CDC mode, DBConvert Streams ingests data from database transaction logs. By analyzing the transaction logs, DBS captures the changes made to the database, including INSERT, UPDATE, and DELETE operations. This approach ensures accurate and comprehensive data ingestion.
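
To see the kind of change events CDC mode reads, you can inspect a MySQL binary log directly; the file name below is a placeholder, so substitute one returned by SHOW BINARY LOGS.

    -- Inspect the first events recorded in a binary log file
    SHOW BINLOG EVENTS IN 'binlog.000001' LIMIT 10;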

Change Data Capture

Change Data Capture is a technology that extracts row-level change events from the transaction logs generated by the database engine. Read What is Change Data Capture.

The DBConvert Streams platform reads the MySQL Binlog and the PostgreSQL WAL to extract data.

Convert Mode

In Convert mode, DBConvert Streams reads data directly from the tables of the source database. By accessing the tables directly, DBConvert Streams bypasses the need for transaction logs or replication mechanisms, resulting in a streamlined and efficient data transfer process.
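
Conceptually, reading directly from source tables amounts to ordinary SELECT queries. The sketch below is purely illustrative (the products table and id column are hypothetical) and does not describe the exact query strategy DBConvert Streams uses internally.

    -- Read a source table in pages, ordered by primary key
    SELECT *
    FROM products
    WHERE id > 0          -- replace 0 with the last id read in the previous page
    ORDER BY id
    LIMIT 1000;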

D

DDL statements

DDL refers to Data Definition Language, a subset of SQL statements that change the database schema structure in some way, typically by creating, deleting, or modifying schema objects such as databases, tables, and views.
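
Typical DDL statements, shown for illustration (the table and view names are hypothetical):

    CREATE TABLE customers (id INT PRIMARY KEY, name VARCHAR(100));
    ALTER TABLE customers ADD COLUMN email VARCHAR(255);
    DROP VIEW customer_report;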

Destination

See Target

Data stream

See Event stream

E

Event

An event is a fundamental unit of data representing the creation, update, or deletion of information in a Source, to be replicated to a Target. For example, this could be a new record inserted in PostgreSQL or updated item information in the Items table in MySQL.
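
For illustration, each of the following statements against a hypothetical Items table produces one event that a Source Reader can capture and replicate:

    INSERT INTO Items (id, name, price) VALUES (1, 'Widget', 9.99);  -- creation event
    UPDATE Items SET price = 11.99 WHERE id = 1;                     -- update event
    DELETE FROM Items WHERE id = 1;                                  -- deletion event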

Event hub

The Event Hub service connects Source Readers with Target Writers. Events from a Source accumulate in the Event Hub until they are consumed by the Target.

Event stream

An Event Stream is made up of a series of events, much like a table in a SQL environment is made up of rows. Each event is a fixed sequence of data elements corresponding to the stream type.

For example, take a system that continuously creates data. Each piece of data in this system represents an event, and the continuous flow of these events is called the event stream.

A stream is a table of data in motion. Think of a never-ending table where new data appears as time goes on: a stream is such a table, and one record or row in a stream is called an event.

I

Ingestion

Ingestion is the act of retrieving data from a source.

L

Logical Replication

Logical replication applies to the Postgres source type. In this mode, data is replicated using the Postgres Write-Ahead Log (WAL) set to the logical level (available in Postgres version 10 and higher).

Read the PostgreSQL CDC Reader section to learn how to set up WAL for logical replication.
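
A minimal sketch of checking and enabling the logical WAL level on a Postgres server (version 10 or higher); changing wal_level requires superuser privileges and takes effect only after a server restart.

    -- Check the current WAL level; it must be 'logical' for logical replication
    SHOW wal_level;

    -- Set the WAL level to logical (restart the server afterwards)
    ALTER SYSTEM SET wal_level = 'logical';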

M

Metrics

Metrics provide insights such as the number of events passed from sources to targets and the elapsed time.

R

Reader

See Source Reader

Replication

Replication involves taking data from a source and loading it into a target (data store).

S

Sink

A synonym for Target. See Target.

Source

A source is an external database, such as MySQL or PostgreSQL, from which DBConvert Streams (DBS) readers collect data and write the collected events to the target system of your choice.

Source Reader

The DBS Source Reader continuously collects data events from a source. Ingested data is passed to the Event Hub to be consumed by Target Writers.

Stream

See Event stream

T

Target

A target is any database or data store to which you want to replicate data from the source.

Target Writer

The Target Writer subscribes to events in the Event Hub and continuously writes the consumed events (database records) to target databases.

W

Warehouse

A warehouse is a repository of data collected from various disparate sources, commonly used for data analysis and reporting. Examples include Amazon Redshift, Google BigQuery, and Snowflake.

Write-Ahead Log (WAL)

See Logical Replication
