Date Posted: September 11, 2019
My client's Data team is leading the transformation to create a data-driven organisation, using data in every decision and customer experience to make insurance easier and better value for customers. They are working closely with the business to build a new large-scale, cloud-hosted, secure, and consolidated data and analytics platform that enables users to interact with all their data from a single trusted source. The platform leverages the economics of big data, cloud elasticity, Machine Learning, Artificial Intelligence, automation, and permissioned data sharing to turn information into business insights and address business and operational challenges.
This is a key role that will be accountable for the development and operations of the Data Platform to drive maximum value from data for business users and in line with company best practices. You will work as part of a cross-functional agile delivery team, including analysts, architects, big data engineers, machine learning engineers, and testers.
You will have the opportunity to work on complex problems, implementing high-performance solutions that run on top of their cloud-based big data platform.
- Work as part of the Data Engineering team to uphold and evolve common standards and best practices, collaborating to ensure that data solutions are complementary, not duplicative.
- Build and maintain a high-performance, fault-tolerant, secure, and scalable data platform that supports multiple data-solution use cases.
- Interface with other technology teams to design and implement robust products, services, and capabilities for the data platform, making use of infrastructure as code and automation.
- Build and support platforms that enable data engineers and data scientists to deliver solutions on the cloud-based big data platform.
- Create patterns, common ways of working, and standardised guidelines for the company to ensure consistency across the organisation.
- Strong experience of AWS architecture/administration in production environments.
- Solid experience of networking and security in cloud-based environments, specifically AWS services such as VPCs, Security Groups, NACLs, and IAM roles.
- Deep understanding of CI/CD using tools such as Jenkins, Bamboo, AWS CodePipeline, and AWS CodeCommit; configuration management using tools such as Ansible, Puppet, or Chef; and code repositories based on Git.
- Expertise in CloudFormation/Terraform for automated provisioning of infrastructure.
- Experience with object-oriented and functional design, coding, and testing patterns, as well as experience engineering software platforms and large-scale data infrastructures.
- Experience writing production-quality code in Python, Java, or Scala.
- Experience building and maintaining distributed platforms that handle high volumes of data.
- Strong platform-level design, architecture, implementation, and troubleshooting skills.
- Good understanding of enterprise patterns and best practices applied to data engineering and data science use cases at scale.
- Good understanding of the AWS cloud storage and computing platform (especially S3, Athena, Redshift, Glacier, EMR, EC2, RDS).
- Good understanding of DevOps/DataOps in an Agile environment; familiarity with Jira and Confluence.
- Understanding of at least one BI tool such as Tableau, Qlik, or Looker.
- Understanding of the insurance value/supply chain.
- Experience with Docker/Kubernetes would be beneficial.
- Knowledge of streaming data technologies such as Kafka, AWS Kinesis, and AWS Lambda.
- Great problem-solving skills, and the ability and confidence to hack your way out of tight corners.
- Ability to prioritise and meet deadlines.
- Conscientious, self-motivated, and goal-oriented.
- Excellent attention to detail.
- Willingness and an enthusiastic attitude to work within existing processes and methodologies.
For more information, please contact Alex at Jefferson Frank on 0191 814 7445.