We are looking for an experienced Lead Data Integration Specialist to join our team.
The ideal candidate has at least 5 years of experience in data management, storage, modeling, analytics, migration, and database design. As a Tech Lead/Team Lead, you will design and implement innovative data integration solutions, contribute to cloud solution architecture, and lead a high-performing team to ensure best practices are followed. Your deep expertise in cloud environments, data warehousing solutions, and data security will help drive the success of our projects.
Responsibilities
- Lead a team of engineers, providing technical leadership and mentorship
- Design and implement data integration solutions, model databases, and build scalable data platforms
- Work with both classic data technologies and modern cloud or hybrid data solutions
- Contribute to cloud solution architecture and serve as a role model for the team
- Collaborate with product and engineering teams to define technical requirements and drive architectural decisions
- Build partnerships with architects and stakeholders within the organization
- Perform detailed analysis of business problems and technical environments to design high-quality solutions
- Participate in code reviews and testing to ensure adherence to best practices
- Foster a high-performance engineering culture and provide technical guidance
- Write and maintain project documentation, including technical documentation and use cases
Requirements
- 5+ years of experience in data management, storage, modeling, analytics, migration, and database design
- Proven ability to lead technical teams and mentor engineers
- Expert knowledge of cloud environments (AWS, Azure, GCP) and data warehousing solutions (Redshift, Azure Synapse Analytics, Google BigQuery, Snowflake)
- Experience with data integration tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Apache NiFi, KNIME, SSIS)
- Strong understanding of relational databases (MS SQL Server, Oracle, MySQL, PostgreSQL) and production coding experience in SQL, Python, SparkSQL, PySpark, R, Bash, and Scala
- Advanced knowledge of data security, data modeling, and data integration methodologies (OLAP, OLTP, ETL, DWH, Data Lake, Delta Lake, Data Mesh)
- Experience with integration patterns, CDC methods, micro-batching, delta extracts, and housekeeping processes
- Deep understanding of data lineage, metadata management, data traceability, and the ability to create high-quality design documentation
- Compliance awareness (PII, GDPR, HIPAA) and proficiency in English for professional communication
- Experience in direct customer communication and presenting technical solutions effectively
Technologies
- Cloud Platforms (AWS, Azure, GCP): Storage, Compute, Networking, Identity & Security, Data Warehousing (Redshift, Snowflake, BigQuery, Azure Synapse)
- Data Integration Tools (Azure Data Factory, AWS Glue, GCP Dataflow, Talend, Informatica, Apache NiFi, KNIME, SSIS, etc.)
- Programming Languages: SQL, Python, SparkSQL, PySpark, R, Bash, Scala
- Relational Databases (MS SQL Server, Oracle, MySQL, PostgreSQL)
- Data flow orchestration, replication, and data preparation tools
- Version Control Systems (Git, SVN)
- Testing: component testing, integration testing, and data reconciliation