Tutorial DataHub 4 – API
In this new instalment of the DataHub tutorial series, we will work on connecting to the platform through its API. As data engineers, our goal is to incorporate DataHub as a Data Governance tool into our ecosystem.…
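As a flavour of what working against the DataHub API looks like, here is a minimal sketch of assembling a GraphQL search request for a DataHub instance. The endpoint URL, the query text, and the input fields are assumptions based on DataHub's public GraphQL schema, not taken from this post; adjust them to your deployment.

```python
import json

# Assumed endpoint of a local DataHub deployment (hypothetical; adjust).
DATAHUB_GRAPHQL_URL = "http://localhost:8080/api/graphql"

# A search query sketched from DataHub's public GraphQL schema.
SEARCH_QUERY = """
query search($input: SearchInput!) {
  search(input: $input) {
    total
    searchResults { entity { urn type } }
  }
}
"""

def build_search_payload(text: str, entity_type: str = "DATASET",
                         count: int = 10) -> dict:
    """Assemble the JSON body for a DataHub GraphQL search call."""
    return {
        "query": SEARCH_QUERY,
        "variables": {
            "input": {
                "type": entity_type,
                "query": text,
                "start": 0,
                "count": count,
            }
        },
    }

# Build the body for a dataset search; POSTing it (e.g. with requests)
# to DATAHUB_GRAPHQL_URL with an auth token is left out of this sketch.
payload = build_search_payload("customers")
print(json.dumps(payload["variables"]))
```

In practice you would send this body as a POST with a personal access token in the `Authorization` header; the point here is only the shape of a GraphQL call against the platform.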
Apache Spark 4.0 has been a long time coming, but to keep the community engaged, the foundation has released preview access to the version. A few months ago, the Databricks developers gave us a small preview of…
In this post, we will cover the main concepts of DataHub at a functional level and study its fundamental elements by taking a tour of the application. To follow along, you can use the DataHub…
In Tutorial DataHub I, we analysed the architecture of this platform. In this post, we present a guide to deploying DataHub and getting started with the tool. DataHub can be deployed in two ways:…
We begin a series of tutorials on the use and operation of DataHub, a Data Governance platform we already mentioned in the post What is a Data Catalog and what does it consist of. In this series, we explore…
The concept of a Data Catalogue seems intuitive, something anyone with even minimal exposure to this world would understand. But putting it into practice and implementing it is rather more complicated. For those who are not familiar with this concept,…
In the world of application development, container-based deployment environments are increasingly common, and Kubernetes has established itself as the standard for orchestrating them. However, for many developers, setting up and managing a complete Kubernetes cluster can be…
In the field of Data Engineering, efficient database design is essential for handling large volumes of data and enabling effective analysis. Throughout my experience as a Data Engineer, I have worked with the main relational database systems and have observed…
Today we are going to talk about two ways of testing in Apache Airflow. Historically, testing in Airflow has been a headache for every user of this popular framework. The coupling of the code with the…
In this post, we are going to talk about how DBT integrates with Spark and how this integration can be useful to us. DBT is a framework that facilitates data modeling across the different stages of the modeling cycle.…