Category: Data Engineering

Introduction to Apache Hadoop

Single Node Configuration without YARN

Sometimes it can be a bit overwhelming to understand the role of the most common open-source technologies used in big data contexts. For example, most of you have probably heard about tools such as…
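To make the single-node, HDFS-only idea concrete, here is a minimal sketch that writes and lists a file on such a node over WebHDFS. The NameNode address, the user and the third-party hdfs Python package are assumptions for illustration, not something taken from the article itself.

```python
# Minimal sketch of talking to a single-node HDFS over WebHDFS, using the
# third-party `hdfs` package. The NameNode URL and user are assumptions for
# a default local Hadoop 3.x setup.
from hdfs import InsecureClient

client = InsecureClient("http://localhost:9870", user="hadoop")

# Write a small text file into HDFS and list the target directory.
client.write(
    "/tmp/hello.txt",
    data="hello from a single-node cluster\n",
    overwrite=True,
    encoding="utf-8",
)
print(client.list("/tmp"))
```

Since only storage operations are involved, no YARN daemons need to be running for a sketch like this to work.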

First steps with PySpark and PyCharm

First steps to programming in PySpark with PyCharm

A definitive guide to configuring the PySpark development environment in PyCharm, one of the most complete IDE options. Spark has become the big data tool par excellence, helping us process large volumes of data in a simplified, clustered and fault-tolerant way…
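As a taste of what that environment enables, here is a minimal PySpark sketch of the kind you could run from PyCharm once the project interpreter is set up; the local master setting and the sample data are assumptions for illustration.

```python
# Minimal PySpark example, runnable once pyspark is installed in the
# PyCharm project's interpreter. The local[*] master is an assumption
# for single-machine development; a real cluster would use its own master URL.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("pycharm-first-steps")
    .master("local[*]")  # use all local cores
    .getOrCreate()
)

# Small in-memory DataFrame used purely for illustration.
df = spark.createDataFrame(
    [("alice", 34), ("bob", 29)],
    ["name", "age"],
)

df.filter(df.age > 30).show()

spark.stop()
```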

Pentaho PDI Plugin for Airflow

Schedule, orchestrate and monitor your Kettle tasks with Airflow using this Pentaho plugin. At Damavis we know the importance of data processing. Extracting, cleaning, transforming, aggregating, loading or cross-referencing multiple data sources allows our clients to obtain insights or predictive…
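To illustrate the kind of scheduling the post describes, here is a minimal Airflow DAG sketch that runs a Kettle transformation once a day. It deliberately uses Airflow's generic BashOperator to call pan.sh rather than the plugin's own operators, and the PDI install path, the .ktr file and the schedule are assumptions.

```python
# Minimal Airflow 2.x DAG sketch that schedules a Kettle transformation daily.
# It calls pan.sh through the generic BashOperator instead of the Pentaho
# plugin's operators; the PDI installation path and .ktr file are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="kettle_daily_transformation",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_transformation = BashOperator(
        task_id="run_sales_transformation",
        # Adjust these paths to the actual PDI install and transformation file.
        bash_command=(
            "/opt/pentaho/data-integration/pan.sh "
            "-file=/opt/etl/transformations/sales.ktr -level=Basic"
        ),
    )
```

Swapping pan.sh for kitchen.sh would schedule a Kettle job instead of a single transformation, following the same pattern.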