Data Engineer
About the role
Why you’ll like it:
- Meaningful work: build the data foundations used by Pricing, Actuarial and analytics teams.
- Modern stack: GCP, BigQuery, dbt; scope to lift performance, quality and reliability.
- Collaborative environment with data scientists, analysts and engineering peers.
What you'll do:
- Design, build and maintain reliable ETL/ELT pipelines into a Google Cloud data platform.
- Deliver governed, well-documented data products for pricing and actuarial use cases.
- Triage and deliver ad-hoc data requests for analytics and reporting.
- Embed robust testing/validation and uplift data quality, lineage and performance.
- Write clean, readable, well-documented code; align with team architecture and standards.
The details:
- Length: 12 months
- Location/work setup: Hybrid working based in Auckland CBD
- Team: Collaborative data engineering squad closely partnered with analytics and actuarial
What you'll bring:
- Advanced SQL and strong ETL design and implementation experience.
- Data modelling with dbt (or similar transformation frameworks such as Dataform).
- GCP experience preferred, including BigQuery, plus familiarity with related services (e.g., Cloud Storage, Composer/Airflow, Dataflow); equivalent AWS or Azure experience also considered.
- Python for data engineering, version control and good documentation practices.
- Strong grasp of insurance products and how data flows support pricing/actuarial needs.
- Proven track record of building testable, high-quality pipelines and data products.
- Comfortable engaging with stakeholders, prioritising competing requests and working in Agile teams.
Benefits:
- Wellbeing Discovery Sessions
- Welcome Packs
- Hnry discount
- Exclusive Invites
- Optional Pay-As-You-Go PI/PL Insurance
Sound like you? Apply with your CV or reach out for a chat.
REF: 19296