
General information

Overview

ADIB is a modular integration framework that enables seamless data exchange between systems such as databases, files, web services, portals, and cloud platforms. Each ADIB module defines:

  • Source/Destination Role: Defines whether a module provides data (source), receives data (destination), or supports both.
  • Pre/Post Processing: Optional steps executed before or after data transfer for validation, transformation, or housekeeping.
  • Webservice: Webservice modules connect ADIB to external web-based endpoints and can act as sources, destinations, or both, depending on configuration.

Integration clients must configure appropriate step parameters to interact with each module. These parameters differ by module and determine how it connects, processes, and exchanges data.
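As a rough illustration of what such step parameters might look like, the sketch below models a step that reads from a CSV source and writes to a MySQL destination, with an optional post-process. All key and module names here are illustrative assumptions, not ADIB's actual configuration schema.

```python
# Hypothetical ADIB-style step parameters (illustrative names only,
# not the real ADIB configuration schema).
step_config = {
    "source": {
        "module": "Excel/CSV",           # module providing the data
        "path": "/data/import/products.csv",
        "delimiter": ";",
    },
    "destination": {
        "module": "MySQL",               # module receiving the data
        "host": "db.example.com",
        "table": "products",
    },
    "post_process": {
        "module": "Archive File after Import",
        "archive_dir": "/data/archive",
    },
}

def validate_step(config: dict) -> bool:
    """Minimal check: every step needs at least a source and a destination."""
    return "source" in config and "destination" in config

print(validate_step(step_config))  # → True
```

The point of the sketch is the shape: each step names a source module, a destination module, and optional pre/post-processing, and each module brings its own parameters.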


Module Categories and Descriptions


Data File Modules

Excel / CSV

  • Role: Source and Destination
  • Description: Reads or writes structured data in .xlsx or .csv formats.
  • Use Cases: Product catalogs, pricing lists, data exports/imports.
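A minimal sketch of the kind of file-based exchange this module performs, using Python's standard `csv` module (this is not ADIB's internal implementation):

```python
import csv
import io

# Write a small product list as semicolon-delimited CSV, then read it back.
rows = [
    {"sku": "A-100", "name": "Widget", "price": "9.90"},
    {"sku": "A-101", "name": "Gadget", "price": "19.50"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["sku", "name", "price"], delimiter=";")
writer.writeheader()
writer.writerows(rows)

buf.seek(0)
read_back = list(csv.DictReader(buf, delimiter=";"))
print(read_back[0]["sku"])  # → A-100
```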

Database Modules

MySQL

  • Role: Source and Destination
  • Description: Connects to MySQL databases for read/write operations.
  • Notes: Supports MySQL-compatible databases only.

Oracle

  • Role: Source and Destination
  • Description: Integrates with Oracle databases using Oracle SQL syntax.

MSSQL

  • Role: Source and Destination
  • Description: Connects to Microsoft SQL Server using T-SQL syntax.
  • Notes: Supports stored procedures and specific MSSQL commands.

InternalDB (SQLite)

  • Role: Source and Destination
  • Description: Embedded SQLite database for temporary or cached data storage.
  • Use Case: Offline processing, staging, or data transformation.
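The staging pattern this module supports can be sketched with Python's built-in `sqlite3` (illustrative only, not ADIB's actual InternalDB API): land raw rows in a temporary table, transform them in place, and read out the staged result.

```python
import sqlite3

# In-memory SQLite database used as a staging area.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (sku TEXT, price REAL)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [("A-100", 9.90), ("A-101", 19.50)])

# Example transformation step: apply a 10 % markup during staging.
conn.execute("UPDATE staging SET price = ROUND(price * 1.1, 2)")

staged = conn.execute("SELECT sku, price FROM staging ORDER BY sku").fetchall()
print(staged)  # → [('A-100', 10.89), ('A-101', 21.45)]
conn.close()
```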

XML-Based Modules

XML

  • Role: Destination Only
  • Description: Writes data into XML files using defined schemas.

Template Writer

  • Role: Destination Only
  • Description: Creates XML output using templates for structured exports.
  • Use Case: XML templated output for system-to-system exchange.
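To illustrate the idea of template-driven XML output (the template syntax here is a plain Python `string.Template`, not ADIB's actual template format): fill a fixed XML skeleton with product fields, then verify the result is well-formed before handing it to the target system.

```python
from string import Template
from xml.etree import ElementTree

# Hypothetical XML skeleton with placeholders for product fields.
template = Template("<product><sku>$sku</sku><name>$name</name></product>")
xml_out = template.substitute(sku="A-100", name="Widget")

# Parse the output to confirm it is well-formed XML.
root = ElementTree.fromstring(xml_out)
print(root.find("sku").text)  # → A-100
```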

External System Modules

Supplier Portal

  • Role: Source and Destination (Limited)
  • Description: Integration with supplier-defined portals for restricted synchronization (e.g., product sync).

DB2Mail

  • Role: Destination (Event-driven)
  • Description: Sends automated email notifications when products are created, updated, or deleted.
  • Use Case: Event-based alerts and synchronization triggers.
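A sketch of the event-driven notification pattern DB2Mail implements, built with Python's standard `email.message` (addresses, field names, and the function itself are illustrative, not the module's actual configuration):

```python
from email.message import EmailMessage

def build_notification(event: str, product_id: str) -> EmailMessage:
    """Build a notification message for a product event (hypothetical helper)."""
    msg = EmailMessage()
    msg["From"] = "adib@example.com"
    msg["To"] = "team@example.com"
    msg["Subject"] = f"Product {product_id}: {event}"
    msg.set_content(f"Product {product_id} was {event}.")
    return msg

msg = build_notification("updated", "A-100")
print(msg["Subject"])  # → Product A-100: updated
```

In a real deployment, such a message would then be handed to an SMTP relay; building and sending are separate concerns.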

Individual Webservice Modules

Elasticsearch2Database

  • Role: Source
  • Description: Reads data from Elasticsearch and writes to a relational database.
  • Use Cases: Data migration, analytics.

CS-Elastic

  • Role: Source and Destination
  • Description: Transfers data between ContentServ and Elasticsearch.
  • Use Cases: Portal data sync, search index builds.

CS Elasticsearch CSV Export

  • Role: Source
  • Description: Exports product data from ContentServ’s Elastic index into CSV.
  • Use Cases: Large-scale export for BI, data migration.

CS Elasticsearch Transform

  • Role: Destination
  • Description: Transforms data between two Elasticsearch indices.
  • Use Cases: Adapting CS Elastic data for portal use.

Workflow Actions

  • Role: Execution / Automation Module
  • Description: Executes queued tasks, actions, or workflow steps.
  • Use Cases: Automation of integration processes.

Portal-Changes

  • Role: Source and Destination
  • Description: Tracks portal data changes, writes them into an Elastic index, and synchronizes them back to ContentServ.
  • Use Cases: Portal-side change tracking and re-synchronization.

Pre / Post Process Modules

These modules perform actions before (Pre) or after (Post) a main data operation.

  • FTP / SFTP: Transfers files before or after processing (supports SFCC).
  • Email: Sends notification emails (post only).
  • SCP / RSYNC: File synchronization (not commonly used).
  • Shell: Executes shell scripts before or after a job.
  • Prepare File with Timestamp: Renames files with timestamps.
  • Archive File after Import: Moves processed files to an archive folder.
  • Replace Pattern in File: Performs find/replace within files.
  • Delete Old Logs from Elastic: Removes old Elasticsearch logs by timestamp.
  • Delete Files or Directories: Cleans up files or folders after processing.
  • Azure Upload: Uploads files to Microsoft Azure (similar to FTP).
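Two of the file-handling steps above can be sketched in a few lines of Python (illustrative only, not the modules' actual implementations): "Prepare File with Timestamp" renames a file, and "Archive File after Import" moves it into an archive folder.

```python
import shutil
import tempfile
from datetime import datetime
from pathlib import Path

def prepare_with_timestamp(path: Path) -> Path:
    """Rename a file by appending a timestamp to its stem."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    return path.rename(path.with_name(f"{path.stem}_{stamp}{path.suffix}"))

def archive_file(path: Path, archive_dir: Path) -> Path:
    """Move a processed file into the archive folder, creating it if needed."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(path), str(archive_dir / path.name)))

# Demo on a temporary working directory.
work = Path(tempfile.mkdtemp())
src = work / "products.csv"
src.write_text("sku;price\n")

stamped = prepare_with_timestamp(src)
archived = archive_file(stamped, work / "archive")
print(archived.parent.name)  # → archive
```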

Cloud Modules

CS Cloud Data Export (PIM/DAM)

  • Role: Source
  • Description: Triggers data flow export from ContentServ Cloud and downloads results as CSV.
  • Use Cases: Exporting product data or assets.

CS Cloud Data Import (PIM/DAM)

  • Role: Destination
  • Description: Imports data into ContentServ Cloud (limited functionality).
  • Limitations: Works for simple attribute imports, not sub-tables.

CS Cloud Data Model

  • Role: Source or Destination
  • Description: Exports or imports the data model between ADIB and ContentServ.

CS Cloud Configuration

  • Submodules:
    • CS Analysis Job: Retrieves ContentServ system status (e.g., object IDs).
    • CS Analysis: Analyzes system state and dependencies.
    • Install: Deploys Conigon modules from GitLab into ContentServ.

Module Summary Table

Module | Source | Destination | Description
Excel / CSV | ✔️ | ✔️ | File-based data exchange
MySQL | ✔️ | ✔️ | DB read/write
Oracle | ✔️ | ✔️ | DB read/write
MSSQL | ✔️ | ✔️ | DB read/write
InternalDB (SQLite) | ✔️ | ✔️ | Local cache
XML | — | ✔️ | XML export
Template Writer | — | ✔️ | XML template output
Supplier Portal | ✔️ | ✔️ (limited) | Supplier integration
DB2Mail | — | ✔️ | Email notifications
Elasticsearch2Database | ✔️ | ✔️ | Elastic → DB
Webservice General | ✔️ | ✔️ | REST/SOAP integration
SW Cust/Order | ✔️ | ✔️ | Customer/order sync
SW Articles/Variants | ✔️ | ✔️ | Product sync
CS-Elastic | ✔️ | ✔️ | ContentServ ↔ Elastic
Workflow Actions | ✔️ | ✔️ | Queue-driven automation
Portal-Changes | ✔️ | ✔️ | Portal change tracking
Elastic Search | — | ✔️ | Write to Elastic index
CS Elastic CSV Export | ✔️ | — | Elastic → CSV export
CS Elastic Transform | ✔️ | ✔️ | Elastic → Elastic transformation
Source2Edit Portal | ✔️ | — | Custom portal source

Licensing Model

Conigon ADIB consists of multiple modules that can be licensed individually. This modular approach allows:

  • Starting small and scaling later.
  • Adding new modules or integration servers as needed.
  • Paying only for modules actually in use.