About the Position
We are looking for a Data Architect to help define and build the next generation of a data platform within a large-scale modernization project. The project involves transitioning from a monolithic system to a services-based architecture. You will play a key role in shaping the current-state understanding, target architecture, and data roadmap, working closely with engineering, product, and business stakeholders.
This is a hands-on architectural role focused on clarity, structure, and forward-looking design rather than legacy maintenance. You will help establish a unified data strategy, improve efficiency, reduce information security risks, and prepare the foundation for future ML and AI capabilities.
Responsibilities
- Document and validate the as-is data architecture, including data flows, storage, pipelines, and integrations.
- Design the target data architecture, with strong consideration for modern lakehouse patterns and scalable cloud-native solutions.
- Evaluate current tooling such as dbt, AWS Glue, and PostgreSQL, and recommend improvements or replatforming paths.
- Collaborate with platform engineering to define event-driven data ingestion using Kafka and Protobuf, as well as shared compute patterns.
- Contribute to the creation of a data product roadmap, including milestones, dependencies, and architectural priorities.
- Define and promote data standards, terminology, and canonical models across teams.
- Partner with engineering and business subject matter experts to ensure data architecture supports product needs and future ML and AI workflows.
- Provide architectural guidance on security, governance, and access control.
- Produce clear architectural artifacts such as diagrams, models, and documentation for technical and non-technical audiences.
- Support the onboarding of, and transition to, a future permanent Data Lead.
Requirements
- Proven experience as a Data Architect, Senior Data Engineer, or similar role in modern cloud environments.
- Strong understanding of data modeling, ETL and ELT, data governance, and distributed data systems.
- Experience with modern data platforms such as Databricks, Snowflake, or equivalent lakehouse or warehouse technologies.
- Familiarity with dbt, AWS Glue, or similar transformation and orchestration tools.
- Experience designing event-driven data ingestion using Kafka and streaming pipelines is a strong plus.
- Ability to create clear architectural documentation and communicate complex concepts simply.
- Comfort working in environments undergoing modernization and organizational change.
- Collaborative mindset with the ability to work across engineering, product, and business teams.
- Interest in or exposure to ML and AI workflows, MLflow, or model-driven architectures is a plus.
- Exposure to data lakehouse concepts and unified governance layers.
Nice to Have
- Experience in organizations transitioning from monolithic to microservices architectures.
- Understanding of information security considerations in data architecture.
- Ability to influence without authority and guide teams through ambiguity.