

Stambia Data Integration allows you to work with Hadoop distributions to build fully customized integration processes.

The Hadoop framework is used to process large sets of data through several specific technologies such as Hadoop Distributed File System (HDFS), Hive, HBase, Sqoop, and Spark.
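As a brief illustration of how these technologies are typically driven outside of Stambia, the sketch below loads a file into HDFS and queries it with Hive. The paths, connection URL, and table names are assumptions made up for the example; adapt them to your own cluster.

```shell
# Copy a local file into HDFS (directory and file names are example values)
hdfs dfs -mkdir -p /user/demo/input
hdfs dfs -put sales.csv /user/demo/input/

# Query the data with Hive through Beeline
# (the HiveServer2 URL and the demo.sales table are assumptions)
beeline -u jdbc:hive2://hive-server:10000 \
        -e "SELECT COUNT(*) FROM demo.sales;"
```

The Stambia Hadoop connector automates this kind of interaction from within your integration processes; the dedicated articles below cover the configuration for each technology.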

Prerequisites:
  • Stambia DI Designer S18.3.6 or higher
  • Stambia DI Runtime S17.4.6 or higher
  • Java 8 or higher

Note:

Stambia DI is a flexible and agile solution. It can be quickly adapted to your needs.

If you have any questions, feature requests, or issues, do not hesitate to contact us.

This component may require a dedicated license.

Please contact the Stambia support team if you have any doubt or if you need to use it in a production environment.

 

Download

You can find below the necessary resources to work with this Component in Stambia DI.

Name                 Description                                         Download
Component resources  The resources required to install the Component.    Component resources download

 

All the necessary installation instructions can be found in the following getting started article.
It describes the first steps to install and prepare your environment for the Hadoop Connector.
We strongly advise reading it carefully when starting to use Hadoop with Stambia, to make sure everything is set up correctly.
Once the installation is finished, we suggest having a look at the dedicated articles for each technology to learn how to use and configure them.

 

Supported Hadoop technologies

You can find below an overview of the technologies that can be used with the Stambia DI Hadoop connector.

Technology  Documentation
HDFS        Presentation article, Getting started article
Hive        Presentation article, Getting started article
HBase       Presentation article, Getting started article, Stambia HBase User Guide
Impala      Presentation article, Getting started article
Sqoop       Presentation article, Getting started article

 

 

