Data Architect

Champion the end-to-end build of a "Big Data" solution by utilising best-in-industry tools and technologies.
  • Build and optimise "Big Data" pipelines, architecture and data sets
  • Consult with valued clients on best-practice data services engineering
  • Market leader in Cloud, Security, Digital and Infrastructure

Company:
Join a dynamic company that is committed to continuous innovation and uncompromising service excellence. With high value placed on personal development, they offer a variety of training and certifications to ensure continued growth in your respective field.

The Opportunity:
The purpose of this role is to provide data platform architecture and data engineering expertise through customer consulting engagements and internal product development projects. You will be working with a large government client to build a Big Data software stack, specifically using the Hadoop and Apache Spark frameworks.

Play a vital role in helping customers solve complex business transformation problems, whilst leading data services product development initiatives. You will get the opportunity to lead proof-of-concept engagements involving cloud and big data technology.

You will be joining an experienced and high-functioning team of technology experts who are passionate about sharing knowledge collaboratively to drive best-practice innovation. To be successful in this role you will be a senior Data Architect with in-depth experience creating large, complex data sets that meet functional and non-functional business requirements.

Must have experience:
  • General infrastructure and high-performance compute expertise e.g. server, storage (object, block, file), virtualisation (VMware).
  • Experience with big data and Linux security features e.g. Sentry, Navigator.
  • Strong analytical skills related to working with unstructured datasets.
  • Exposure to DevOps and CI/CD process and tooling e.g. Jenkins, Git.
  • Experience with object-oriented/object-function scripting and orchestration languages e.g. Python, Java, C++, Scala, Chef, Puppet or Ansible.
  • Strong understanding of Azure, AWS and/or GCP service configuration and implementation, especially data analytics and storage, e.g. Blob Storage, HDInsight, S3, Redshift, EMR, BigQuery etc.
  • At least 5 years working in a similar role providing consulting and data solution architecture advice.

This is a fantastic opportunity for the right person. If you're looking to join a top-class organisation working on a large-scale big data programme, apply now to be considered for this role.

Job reference: JO-1902-49767


Apply now