Senior Data Engineer

Data

Data Engineering

São Paulo, SP

Remote

Why join us

TRACTIAN is transforming the industrial world by empowering frontline maintenance workers to achieve more. We’ve fused cutting-edge hardware with innovative software into one powerful platform, disrupting legacy systems and delivering smarter, faster solutions for our clients.

Analytics at TRACTIAN


The Data Engineering team is responsible for building and maintaining the infrastructure that handles massive datasets flowing through TRACTIAN’s systems. This department ensures the availability, scalability, and performance of data pipelines, enabling seamless access and processing of real-time and historical data. The team’s core objective is to architect robust, fault-tolerant data systems that support everything from analytics to machine learning, ensuring that the right data is in the right place, at the right time.

What you'll do


As a Senior Data Engineer, you will build data pipelines that handle extraction, loading, and transformation across several contexts. The goal is a reliable, available, and trustworthy system that serves as the backbone of the entire analytics pipeline. The challenges range from large datasets to high-throughput systems, so the work is not limited to a small set of data-handling techniques. You will also lead initiatives on data pipeline reliability and observability.

Responsibilities

  • Develop and maintain scalable data pipelines and ETL processes.

  • Design, implement, and optimize data extraction and loading processes using appropriate data engineering design patterns.

  • Lead data engineering reliability and observability efforts, raising the analytics team's awareness of data flow issues before they become incidents.

  • Collaborate with backend and analytics engineers in a holistic data engineering process, loading data in accordance with technical requirements.

  • Ensure data quality and consistency across various sources by implementing data validation and cleansing techniques.

  • Work with cloud-based data warehouses and analytics platforms to manage and store large datasets.

  • Monitor and troubleshoot data pipelines to ensure reliable and timely delivery of data.

  • Document data processes, workflows, and best practices to enhance team knowledge and efficiency.

  • Create dashboards as internal data products.

Requirements

  • Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.

  • Advanced English proficiency.

  • 2+ years of experience in Data Engineering or Analytics.

  • Highly experienced in SQL and database management systems such as PostgreSQL and ClickHouse.

  • Strong understanding of data warehousing concepts and experience with ETL tools (e.g., Airflow, dbt).

  • Strong experience with Python and the modern data engineering stack (e.g., DuckDB, Polars).

  • Experience with streaming tools (e.g. Kafka).

  • Experience with cloud-based data platforms like AWS Redshift.

  • Experience with Go or Rust is a plus.

  • Experience with observability tools (e.g., Datadog, Grafana) is a plus.

Compensation

  • Competitive salary and stock options

  • 30 days of paid annual leave

  • Education and courses stipend

  • Earn a trip anywhere in the world every 4 years

  • R$1.035/month meal allowance

  • Health plan with national coverage and no co-participation

  • Dental insurance: we help you with dental treatment for a better quality of life.

  • Gympass and Sports Incentive: R$300/month extra if you practice physical activities

I want to apply

If you want to build a ship, don't organize people to collect wood, assign them tasks, and give orders. Instead, teach them to long for the vast and endless sea.

Antoine de Saint-Exupéry