Stambia ELT Component for Teradata

Teradata is a widely used, fully scalable database designed to manage large volumes of data. The Teradata Vantage platform offers a variety of data and analytics solutions with hybrid cloud offerings.

The Stambia component for Teradata is built to complement Teradata's robust capabilities and to simplify integration in analytics projects that use Teradata solutions.

 
Discover how Stambia's unified, graphical approach makes data integration with Teradata easy

Data Integration (ETL) processes with Teradata: 6 challenges


The need for agile data integration with Teradata

In any analytics project, it is important to choose solutions that improve your agility. Many solutions become quite complex in the long run, and as a result a lot of time and effort goes into managing these tools.

Teradata, with its Vantage platform, focuses on providing the best answers and offers hybrid cloud products that simplify your analytics journey.

The integration solution, on the other hand, should offer the same simplicity, flexibility and ease of use, so that you can focus your time on implementation and data requirements and be more agile in your projects.

Agile data integration with Teradata
 

Manage a traditional data warehouse and Big Data with the same skills

Data warehouse and Big Data with Teradata

With changing data landscapes, we see more and more customers using various kinds of technologies and architectures to fulfill different types of requirements.

As a result, many IT teams are dealing with new types of data formats, applications and ecosystems. The ability to integrate and exchange data between Teradata and Hadoop technologies, or to integrate Teradata with Spark, is an essential feature for a data integration tool.

These features should be part of the same solution (same design, same architecture), making it easier for IT teams to work consistently across different types of projects.

 

Deploy hybrid data integration on-premises or in the cloud

Cloud continues to become mainstream in most organizations, and adoption of "as-a-service" models is increasingly popular. In fact, many organizations are now considering, and some have already moved to, a multi-cloud architecture.

However, some organizations still prefer to keep part of their data on-premises, resulting in a hybrid architecture. It is therefore essential for IT teams to be able to support integration needs from all these standpoints.

A key point is owning a solution that supports hybrid and multi-cloud implementations: for example, integration projects with an on-premises Teradata instance and/or Teradata Vantage on AWS or Azure, as well as GCP for other needs.

Hybrid cloud data integration with Teradata
 

Getting the best Performance for batch and real-time Teradata integration

Performance for data integration with Teradata

When you own a Teradata solution, an integration tool that relies on a proprietary, external engine to process and transform the data is simply unnecessary.

With very robust capabilities to ingest, analyze and manage data, Teradata checks all the boxes in terms of integration (or ETL).

An integration solution should leverage these capabilities to improve the overall performance of your analytics projects. The solution should also automate integration to provide agility and flexibility in your developments, and be able to manage batch data integration for large volumes of data as well as real-time data ingestion for immediate analysis of your business.

The ELT approach is the best way to optimize your investment and get the best performance.
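As a simple illustration, here is a minimal ELT-style sketch (table and column names are hypothetical): data is first bulk-loaded as-is into a staging table, then transformed with set-based SQL that runs entirely inside Teradata, so no external engine ever processes the rows.

  -- Hypothetical ELT transformation step: executed by Teradata itself,
  -- not by an external ETL engine
  INSERT INTO sales_dw.fact_orders (order_id, customer_id, order_amount)
  SELECT s.order_id,
         c.customer_id,
         CAST(s.amount_txt AS DECIMAL(12,2))
  FROM   sales_dw.stg_orders s
  JOIN   sales_dw.dim_customer c
         ON c.source_id = s.cust_ref;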
 

Use a solution that is customizable for Teradata Vantage

In an age where Machine Learning algorithms are the new norm, the ability to feed them the right data is very important. Insufficient or poor-quality data can impact the results of these algorithms.

On the other hand, the Teradata platform is constantly evolving, adding new and improved features to give users the best of the solution. A rigid integration solution has no place in the current technological landscape.

Integration tools therefore have to be highly customizable, ready to absorb technological changes and new data requirements.

Flexible data integration with Teradata
 

Master and Optimize the cost of integration with Teradata Vantage

Master data integration costs with Teradata

Cost optimization is one of the critical topics for most CIOs, for any new or existing data and analytics project or initiative. The complexity of managing cost at every stage of the project depends on the kind of tool sets used.

When talking about integration, costs start with buying software licenses, the hardware to support them, the human resources to design and implement the projects, and their long-term maintenance. Evaluation of a data integration solution should therefore focus both on its technical capabilities and on the overall expenses you incur using it.

What is important to realize before moving forward is that these expenses are not just the up-front cost, but a well-thought-out expenditure over the next few years.

How does Stambia work with Teradata Vantage?

 

Stambia integration with Teradata and Teradata Parallel Transporter

The Stambia component for Teradata is the best way to simplify the extraction and integration of data with the Teradata MPP system, providing increased productivity with an easy-to-use graphical solution.



 

Native reverse engineering of Teradata Database structures

The Stambia component for Teradata reverse engineers all the information about the database, which becomes really beneficial when designing for optimization and automation.

This includes standard information such as schemas, tables, columns and datatypes, as well as Teradata-specific details such as primary indexes (PI) and unique primary indexes (UPI).

On the other hand, this metadata can be customized and adapted to achieve specific optimization objectives.
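As an illustration, here is the kind of physical design information the reverse engineering captures (the table itself is hypothetical):

  -- Hypothetical dimension table: the reverse captures columns and
  -- datatypes, plus Teradata-specific details such as the unique primary index
  CREATE TABLE sales_dw.dim_customer (
      customer_id   INTEGER NOT NULL,
      source_id     VARCHAR(20),
      customer_name VARCHAR(100)
  )
  UNIQUE PRIMARY INDEX (customer_id);

  -- Statistics on key columns can likewise be detected and leveraged
  COLLECT STATISTICS COLUMN (customer_id) ON sales_dw.dim_customer;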

Reverse engineering and metadata with Teradata
 

Stambia business-oriented designs: simplify and be more productive with Teradata

Data mapping on Teradata with Stambia

Stambia's Model Driven Approach (through templates) adapts the connector to all types of projects.

Simply connect to various technologies and file formats and design a simple mapping to extract from the source, load into Teradata and transform inside Teradata. The Universal Data Mapper in Stambia lets users focus on business rules and work on a very simple, high-level design.

Customers using Teradata and Stambia testify that the turnaround time to implement a new data project is much shorter compared to what it was previously.



 

Industrialization and Automation through Stambia Templates

As an ELT solution, Stambia uses the native utilities to ingest, analyze and transform data. This works best in terms of the performance needed to process large data sets. It also significantly reduces the need for a dedicated ETL server, and the tool comes with a light footprint.

On the other hand, Stambia's Model Driven Approach automates many of the redundant steps required by other tools. Some of the benefits of this approach are:

  • Ease in managing large teams on a big project, thanks to a consistent and automated integration solution
  • Guaranteed expected performance as a result of the ELT approach
  • Flexibility in adapting to various optimization or customization needs
Industrialization with Teradata
 

Take advantage of embedded Teradata optimizations

Teradata optimizations with Stambia

Stambia uses specific methods adapted to Teradata in order to integrate or extract the data.

Loads and exports can be done using tools such as Teradata Parallel Transporter (TPT), FastLoad, MultiLoad, FastExport and other utilities provided by Teradata, as sketched below.
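For example, a minimal FastLoad script of the kind such a bulk load relies on (credentials, file and table names are hypothetical):

  /* Hypothetical FastLoad script: bulk-loads a delimited file
     into an empty staging table using Teradata's native loader */
  LOGON mytdpid/stambia_user,secret;
  DATABASE sales_dw;
  BEGIN LOADING sales_dw.stg_orders
        ERRORFILES sales_dw.stg_orders_err1, sales_dw.stg_orders_err2;
  SET RECORD VARTEXT ",";
  DEFINE order_id   (VARCHAR(10)),
         cust_ref   (VARCHAR(20)),
         amount_txt (VARCHAR(20))
  FILE = /data/in/orders.csv;
  INSERT INTO sales_dw.stg_orders (order_id, cust_ref, amount_txt)
  VALUES (:order_id, :cust_ref, :amount_txt);
  END LOADING;
  LOGOFF;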

The incremental mode of integrating the data offers different methods, such as "insert and update", table-renaming methods, "delete and insert" methods, or "merge" operations.
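The "merge" method, for instance, maps to a single set-based Teradata statement (a sketch with hypothetical tables; the ON clause matches the target's primary index):

  -- Hypothetical incremental integration: update matching rows and
  -- insert new ones in one MERGE statement
  MERGE INTO sales_dw.dim_customer tgt
  USING sales_dw.stg_customers src
     ON (tgt.customer_id = src.customer_id)
  WHEN MATCHED THEN UPDATE
       SET customer_name = src.customer_name
  WHEN NOT MATCHED THEN INSERT
       (customer_id, customer_name)
       VALUES (src.customer_id, src.customer_name);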

Primary indexes and statistics are detected and used to improve performance.

"Query band" can be used to track the SQL orders generated by Stambia and provide a way to optimize and master the processes.

 

Transform Teradata BTEQ / SQL scripts into Stambia graphical mappings

 

Most ETL tools on the market fail to work inside Teradata. As a result, a lot of data transformation has to be done through BTEQ: users write BTEQ scripts to fulfill their data transformation needs.

Let's try something new with the Stambia component for Teradata Vantage and discover how the tool entirely automates the migration of BTEQ scripts to produce Stambia mappings and processes.
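For illustration, here is a hand-written BTEQ script of the kind the migration tool converts into a graphical mapping and process (credentials and tables are hypothetical):

  .LOGON mytdpid/stambia_user,secret
  DATABASE sales_dw;

  /* hand-coded transformation step that becomes a Stambia mapping */
  INSERT INTO fact_orders (order_id, customer_id, order_amount)
  SELECT o.order_id, c.customer_id, CAST(o.amount_txt AS DECIMAL(12,2))
  FROM   stg_orders o
  JOIN   dim_customer c ON c.source_id = o.cust_ref;

  .IF ERRORCODE <> 0 THEN .QUIT 8
  .LOGOFF
  .QUIT 0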

Technical specifications and prerequisites


Simple and agile architecture

  • Designer: data integration development studio
  • Runtime: engine that executes data integration processes
  • Production Analytics: management of production (logs, delivery)

Protocols

JDBC, HTTP, HTTPS

Storage

Depending on the architecture, the following storage can be used:

  • HDFS
  • Azure Blob Storage
  • Amazon S3
  • Google Cloud Storage
Connectivity

You can extract data from:

  • Any relational database system such as Oracle, PostgreSQL, MSSQL, ...
  • Any NoSQL database system such as MongoDB, Elasticsearch, Cassandra, ...
  • Any high-performance database system such as Vertica, Teradata, Netezza, Actian Vector, Sybase IQ, ...
  • Any Cloud system such as Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, ...
  • Any ERP application such as SAP, Microsoft Dynamics, ...
  • Any SaaS application such as Salesforce, Snowflake, BigQuery, ...
  • Any Big Data system such as Spark, Hadoop, ...
  • Any messaging MOM or ESB such as JMS, Apache ActiveMQ, Kafka, ...
  • Any file type such as XML, JSON, ...
  • Any spreadsheet such as Excel, Google Spreadsheet, ...

For more information, consult the technical documentation

 

Technical connectivity
  • FTP, SFTP, FTPS
  • Email (SMTP)
  • LDAP, OpenLDAP
  • Kerberos

Structured and semi-structured

XML, JSON, Avro

Data Integration - Standard features

  • Reverse: the database structure can be reverse-engineered into a dedicated Metadata
  • DDL/DML operations: DML/DDL operations can be performed on the database, such as Insert, Update, Select, Delete, Create or Drop
  • Integration methods: Append, Incremental Update
  • Staging: the databases can be used as a staging area for data transformation, mutualization, ... The following modes are supported:
    • staging as subquery
    • staging as view
    • staging as table
  • Reject: reject rules can be defined to filter or detect data that does not fulfill the rules during integrations
    • Three types of rules can be created: Fatal, Warning, Reject
    • Depending on the specified type, the rejected data will not be handled in the same way
    • Rejects from previous executions can also be automatically recycled
  • Replication: replication of databases is supported. The replication source can be any RDBMS, flat files or XML files, Cloud, ...
Data Integration - Advanced features
  • Slowly Changing Dimension Integrations : Integrations can be performed using Slowly Changing Dimension (SCD)
  • Complete support for Teradata utilities such as TPT, with configuration to perform specific loads (MultiLoad, FastLoad, etc.)
  • Optimization features such as setting the SQL query band for specific steps
  • Different types of integration are automated, e.g. Slowly Changing Dimension Type II, Cancel / Replace, Incremental, Merge, etc.
  • Change Data Capture (CDC)
  • Privacy Protection : GDPR capabilities
    • Anonymization
    • Pseudonymization
    • Audits
    • ...
  • Data Quality Management (DQM) : Data Quality Management capabilities added to the metadata and the Stambia Designer
Requirements
  • Operating System:
    • Windows XP, Vista, 2008, 7, 8 or 10 / both 32-bit and 64-bit are supported
    • Linux / both 32-bit and 64-bit are supported
    • Mac OS X / only 64-bit is supported
  • Memory
    • At least 1GB of RAM
  • Disk Space
    • The system must have at least 300 MB of free disk space
  • Java Virtual Machine
    • JVM version 1.8 or higher
  • Miscellaneous: Linux windowing system: GTK+ 2.6.0 and all its dependencies are required on Linux environments
Cloud Deployment

Docker image available for the Runtime and Production Analytics components
Standards supported
  • Open API Specifications (OAS)
  • Swagger 2.0
  • W3C XML
  • WSI compliant
  • SQL
Scripting languages

Jython, Groovy, Rhino (JavaScript), ...
Source versioning system

Any plugin supported by Eclipse (CVS, SVN, git, …)

Migration from Oracle Data Integrator (ODI)*, Informatica*, DataStage*, Talend, Microsoft SSIS

* Seamless migration capabilities

Want to know more?

Consult our resources

Ask our data integration experts for advice.
Contact us
 
Video: Teradata BTEQ / SQL migration tool
Open Video
Your customized demonstration
Get your demo
Discover our training and certifications
Find out more