Stambia for the Cloud

"By 2023, 75% of all databases will be on a cloud platform, reducing the DBMS vendor landscape and increasing complexity for data governance and integration."
source : Gartner, January, 2019.

Cloud architecture fundamentally changes how we view information systems: enterprise data becomes decentralized, virtualized and outsourced.

This evolution implies a change in the way data is exchanged, extracted, aggregated, or simply viewed.

It can thus be tempting to choose Cloud-specific ETL solutions. However, these solutions often act as simple "synchronizers": they replicate data from source applications or databases to the target database in the cloud, with limited ability to transform data. They are very often Cloud EL (Extract and Load) tools, whose transformation capacity is reduced or non-existent.


Stambia chose to be fully compatible with the cloud, not only in its architecture but also in how it conceives of interacting with an organization's data.

Stambia Data Integration for Cloud and Microsoft Azure

ELT architecture, the best approach for cloud projects

Why ELT?

The traditional way of transforming data requires a proprietary engine, which is poorly suited to a cloud architecture.

Indeed, it is not technically optimal to extract data from the cloud, transform it on a proprietary engine, then load the data back into the cloud.

This directly impacts performance, network traffic and ultimately, the overall cost of the project.


Because it requires no proprietary engine, the ELT architecture adapts naturally to the cloud.

Data can remain in the cloud, avoiding unnecessary data movement and network overhead, which significantly reduces the overall cost of the architecture.
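To make the contrast concrete, here is a minimal sketch of the ELT pattern (independent of Stambia), using an in-memory SQLite database as a stand-in for a cloud warehouse; table and column names are illustrative. The transformation is expressed as a single SQL statement executed by the target engine itself, so no row ever leaves the database:

```python
import sqlite3

# Stand-in for a cloud data warehouse; in a real ELT flow this would be
# a connection to e.g. Snowflake, Redshift, BigQuery or Azure Synapse.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT);
    CREATE TABLE sales_by_status (status TEXT, total REAL);
    INSERT INTO raw_orders VALUES
        (1, 100.0, 'shipped'), (2, 40.0, 'shipped'), (3, 25.0, 'cancelled');
""")

# ELT: the transformation is pushed down to the target engine as SQL --
# no rows are extracted to an intermediate proprietary engine.
db.execute("""
    INSERT INTO sales_by_status (status, total)
    SELECT status, SUM(amount) FROM raw_orders GROUP BY status
""")

# Totals per status, computed entirely inside the database
totals = dict(db.execute("SELECT status, total FROM sales_by_status"))
```

The design point is that the only thing moving over the network is the SQL text, not the data it transforms.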


 

Take advantage of the scalability of the cloud without letting costs "explode"

By definition, the promise of the Cloud is an infinitely scalable system.

The only limit for the customer: consumed services are billed, and the bill can climb quickly without governance.

How do you optimize costs in a natively scalable environment?
Why use the Cloud if the limitations come from the middleware?

In a cloud strategy, it is important to have a data integration solution that makes the most of the cloud architecture, acting in a lightweight and controlled way.

 

A hybrid data integration solution to meet every use case


Why adopt a true Hybrid solution rather than a specific ETL approach for the Cloud?

The traditional, so-called "Legacy" information system is still present. For many companies, it remains the engine of economic activity.
Cloud adoption rarely happens in Big Bang mode.
It is therefore a progressive transition, hence the need for a hybrid data integration solution.

In short, the challenges are numerous:

  • Remain performant and able to handle growing data volumes (thanks to the ELT approach)
  • Be agile and manage synchronization between Cloud and on-premises data, while keeping the ability to perform traditional batch-mode exchanges
  • Manage the various sources and targets, communicating easily between Legacy systems and the Cloud, but also between different Clouds: a multi-cloud, multi-environment mapping (thanks to the universal mapping)
  • Use a single solution, to flatten the learning curve of an ever-changing ecosystem and avoid maintaining several technical solutions
 

Some examples of Cloud projects made with the Stambia solution

 

How does Stambia work for the Cloud?

Stambia, another way to see the Cloud

Thanks to its model-driven approach to mapping design, Stambia considerably reduces the time needed to deliver projects.
The major innovation lies in its mapping, whose motto is: manipulating complex technologies (such as Web services, proprietary APIs - SAP BAPIs or IDocs for example - or hierarchical files such as XML or JSON) must be as simple as managing flat files or simple tables.
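To illustrate what treating a hierarchical file like a flat table means in practice, here is a minimal Python sketch (independent of Stambia, with hypothetical field names) that projects a nested JSON document onto flat, table-like rows:

```python
import json

# A hierarchical JSON document (an order with nested lines); field names
# are illustrative, not a real Stambia schema.
doc = json.loads("""
{
  "order_id": 42,
  "customer": {"name": "ACME"},
  "lines": [
    {"sku": "A1", "qty": 2},
    {"sku": "B7", "qty": 1}
  ]
}
""")

def flatten(order):
    """Project the nested structure onto flat rows, one per order line."""
    for line in order["lines"]:
        yield {
            "order_id": order["order_id"],
            "customer": order["customer"]["name"],
            "sku": line["sku"],
            "qty": line["qty"],
        }

rows = list(flatten(doc))
```

A model-driven tool generates this kind of projection from the metadata model, so the developer only draws the mapping between fields.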

With the traditional integration tools' approach, the same type of mapping involves several steps and forces developers into technical, complex reasoning.

Thanks to the model driven approach, Stambia users can focus on business rules and not on technological complexity.

Stambia stays true to its agile development method to help you simplify and accelerate your cloud projects.

 

Mapping integration model

Stambia, an optimized price model for a natively scalable environment

Stambia offers a simple and clear pricing model.
Because it is based on development effort (the number of developers), it avoids managing complex and hard-to-control parameters such as the number of sources, the volume of data manipulated, the number of integration pipelines, etc.

Thanks to a simple pricing model, the Stambia team works with you to help you calibrate and master the different phases of your project.
Our values are the foundation of your success: delivering value and meeting our commitments.

Stambia thus has a simple, readable pricing offer:

  • No cost based on CPU / Core, number of sources and / or targets
  • A cost based on the number of Designers (Stambia Designer), plus additional options (functionalities) depending on the edition chosen by the customer
  • A traditional licence and maintenance model
 

Stambia connectivity for the Cloud


Stambia has developed specific components for the Cloud to improve performance and design experience.

Stambia technology is scalable: it adapts natively and quickly to changes in your information system. Because it makes web services very easy to manage, Stambia offers immediate compatibility with any open technology accessible via Web services / APIs.

Publishing and invoking your web services with Stambia has never been easier: just a few clicks! Discover the Studio API, a complete visual environment for creating your own APIs.

Finally, Stambia technology makes it possible to integrate any solution very quickly. Delivering a new connector takes on average 3 days to 3 weeks, and that development is done independently of the product roadmap.

 

  • Amazon Redshift
  • Google BigQuery
  • Snowflake
  • Microsoft Azure
 

Stambia association with Docker

Docker is the leading container platform. Stambia offers Docker images for its execution engine, the Stambia Runtime, as well as for its production and monitoring console, Stambia Production Analytics.

Due to its compatibility with Docker, Stambia also partners perfectly with Kubernetes, the container orchestration platform.
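As a purely illustrative sketch of such a containerized deployment, a compose file might look like the following; the service names, image names and ports are placeholders, not Stambia's actual published image references:

```yaml
# docker-compose.yml -- hypothetical sketch; replace image names and ports
# with the references actually provided by Stambia.
version: "3"
services:
  runtime:
    image: stambia/runtime:latest               # placeholder image name
    ports:
      - "42000:42000"                           # placeholder engine port
  production-analytics:
    image: stambia/production-analytics:latest  # placeholder image name
    ports:
      - "8080:8080"                             # placeholder console port
    depends_on:
      - runtime
```

Describing both components as services in one file is also what makes the step to a Kubernetes deployment straightforward.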


Technical specifications and prerequisites


Simple and agile architecture

  • 1. Designer: the development environment
  • 2. Runtime: the execution engine for data integration processes, Web services, ...
  • 3. Production Analytics: execution monitoring and deployment to production
Connectivity

You can extract the data from:

  • Any relational database system like Oracle, PostgreSQL, Microsoft SQL Server (MSSQL), MariaDB, ...
  • Any NoSQL database system like MongoDB, Elasticsearch, Cassandra, ...
  • Any high performance database system like Netezza, Vertica, Teradata, Actian Vector, Sybase IQ, ...
  • Any cloud system like Amazon Web Service (AWS), Google Cloud Platform (GCP), Microsoft Azure, Snowflake, ...
  • Any ERP application like SAP, Microsoft Dynamics, ...
  • Any SaaS application like Salesforce, Snowflake, Big Query, ...
  • Any Big Data system like Spark, Hadoop, ...
  • Any MOM messaging system or ESB like Apache Active MQ, Kafka, OpenJMS, Nirvana JMS, ...
  • Any file system like CSV, XML, JSON, ...
  • Any spreadsheet system like Excel, Google Spreadsheet, ...

For more information, consult our technical documentation

 

Technical Connectivity
  • FTP, SFTP, FTPS
  • Email (SMTP)
  • LDAP, OpenLDAP
  • Kerberos

Standard features

  • Reverse: database structures can be reverse-engineered into metadata.
  • DDL/DML operations: supports object and data manipulation (DDL/DML) operations such as Insert, Update, Select, Delete, Create or Drop
  • Integration strategies: Append, Incremental Update, Slowly Changing Dimension, ...
  • Staging: A database can be used as an intermediate step (staging area) for transformation, reconciling data, etc. The supported modes are:
    • staging as subquery
    • staging as view
    • staging as table
  • Rejects / Data Quality: reject rules can be defined to automatically filter or detect data that does not meet the conditions defined during integrations.
    • 3 types of rejections can be created: Fatal, Warning, Reject
    • Differentiated processing by data type for each rejected data
    • Recycling rejects created in previous executions
  • Replication: Database replication is supported from any source such as relational or NoSQL databases, flat files, XML / JSON files, cloud system, and so on.
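As an illustration of the reject-rule idea (independent of Stambia's actual rule syntax, with hypothetical rule and field names), the sketch below routes each incoming row either to the target or to a rejects list tagged with a severity:

```python
# Hypothetical reject rules: each returns a severity ("Fatal", "Warning",
# "Reject") when a row violates it, or None when the row passes.
def not_null_customer(row):
    return "Fatal" if row.get("customer") is None else None

def positive_amount(row):
    return "Reject" if row["amount"] <= 0 else None

RULES = [not_null_customer, positive_amount]

def integrate(rows):
    """Route each row to the target or to a rejects table with a severity."""
    accepted, rejects = [], []
    for row in rows:
        severities = [s for rule in RULES if (s := rule(row))]
        if severities:
            rejects.append({**row, "severity": severities[0]})
        else:
            accepted.append(row)
    return accepted, rejects

accepted, rejects = integrate([
    {"customer": "ACME", "amount": 10.0},
    {"customer": None, "amount": 5.0},
    {"customer": "Foo", "amount": -1.0},
])
```

Keeping rejected rows with their severity is what makes the recycling of rejects from previous executions possible: they can simply be fed back into `integrate` once corrected.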
Advanced features
  • Slowly Changing Dimension (SCD): integrations can be implemented using Slowly Changing Dimensions (SCD)
  • Loading methods:
    • Generic load
    • COPY loader
  • Change Data Capture (CDC)
  • Privacy Protect: module for managing privacy protection (GDPR) with the following features:
    • Anonymization
    • Pseudonymization
    • Audits
    • ...
  • Data Quality Management (DQM): Data quality management directly integrated with metadata and into the Designer
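As an illustration of the SCD Type 2 mechanics mentioned above (field names are illustrative, not Stambia's generated schema): when a tracked attribute changes, the current dimension row is closed and a new current row is inserted, preserving history.

```python
from datetime import date

def scd2_apply(dimension, key, new_attrs, today):
    """Minimal SCD Type 2 sketch: close the old version, open a new one."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return              # no change, nothing to do
            row["current"] = False  # close the old version
            row["end_date"] = today
            break
    dimension.append({
        "key": key, "attrs": new_attrs,
        "start_date": today, "end_date": None, "current": True,
    })

dim = []
scd2_apply(dim, "C1", {"city": "Paris"}, date(2020, 1, 1))
scd2_apply(dim, "C1", {"city": "Lyon"}, date(2020, 6, 1))
# dim now holds two versions of customer C1: the closed Paris row and
# the current Lyon row.
```

In an ELT setting this logic would be generated as SQL (UPDATE of the current row plus INSERT of the new one) and executed in the target database.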
Technical prerequisites
  • Operating System:
    • Windows XP, Vista, 2008, 7, 8, 10 in 32 or 64 bits
    • Linux in 32 or 64 bits
    • Mac OS X in 64 bits
  • Memory
    • At least 1 GB of RAM
  • Disk space
    • At least 300 MB of available disk space
  • Java environment
    • JVM 1.8 or higher
  • Notes: On Linux, a GTK+ 2.6.0 windowing system with all its dependencies is required
Cloud deployment
  • Docker images available for Runtimes and the operating console (Production Analytics)
Supported standards
  • Open API Specifications (OAS)
  • Swagger 2.0
  • W3C XML
  • WSI compliant
  • SQL
Scripting languages
  • Jython, Groovy, Rhino (JavaScript), ...
Source manager
  • Any plugin supported by Eclipse: SVN, CVS, Git, ...
Migrate from your existing data integration solution
  • Oracle Data Integrator (ODI) *, Informatica *, Datastage *, Talend, Microsoft SSIS

* simple and quick migration possible

Want to know more?

Consult our resources

Ask our Data Integration experts for advice.
Contact us

Discover our training and certifications
Learn more

Your customized demonstration
Get your demo