The Data Usage team is responsible for building and maintaining data pipelines and a centralized Data Warehouse (DWH) at inDrive, built on Google Cloud Platform.
Our mission is to ensure:
- correctness and completeness of data,
- compliance of the DWH with audit requirements,
- transparency and trust in data for business and finance stakeholders.
We are looking for a Middle Analytics Engineer with hands-on data warehouse experience, someone who understands DWH architecture, incremental data processing, and the responsibility that comes with financial data.
Responsibilities:
- Gather and clarify requirements from analytics and finance stakeholders
- Design and evolve DWH architecture (ODS / Data Marts) with audit requirements in mind
- Develop and maintain financial data pipelines
- Design, implement, and support incremental data pipelines
- Develop and optimize data marts in BigQuery (Dataform)
- Analyze source systems and build data flows from source to consumption
- Maintain and develop workflows in Airflow
- Participate in testing, data validation, and release processes
- Investigate data quality issues and financial discrepancies
- Maintain technical documentation required for audit and data governance
Requirements:
- 2–3+ years of experience as an Analytics Engineer, DWH Engineer, or Data Analyst working with a DWH
- Hands-on experience with data warehouses
- Strong understanding of DWH architecture and data layers (ODS, Data Marts)
- Understanding of incremental loads, historical data handling, and deduplication
- Strong SQL skills
- Experience designing and optimizing data marts
- Experience with BigQuery (partitioning, clustering, cost-aware querying)
- Experience with Airflow or similar orchestration tools
- Python for data processing and ETL tasks
- Ability to work with stakeholders and translate business needs into data requirements
Must be familiar with:
- Languages: SQL (strong knowledge), Python (basic knowledge)
- Orchestration: Airflow or a similar orchestration tool
- Version control: Git
- Data Quality / Data Governance / SLAs
Nice to have (not mandatory):
- Cloud & Storage: Google Cloud Platform (BigQuery, Dataform, Cloud Storage)
- Streaming: Kafka-based data delivery pipelines (streaming, schemas, delivery guarantees)
- Processing frameworks: Spark, Apache Beam
- Analytics & BI: Looker, Tableau
- Experience working with financial data or financial reporting
- Experience preparing data for audits or regulatory requirements
What we offer:
- Modern cloud technical stack
- Real impact on data architecture and processes
- Flexible work schedule and format
- Health insurance
- Learning and professional development support
- Access to psychological, financial, and legal support programs