Principal Data Engineer – AWS

As a Principal AWS Data Engineer within the Data Team, you will be responsible for the implementation and growth of my client's AWS cloud infrastructure. You will work with a team of architects and engineers to design, build, and maintain scalable, secure, and reliable AWS solutions, and you will provide technical leadership and mentorship to team members.

Responsibilities
* Act as the lead member of a team of engineers in the design, implementation, and maintenance of AWS-based cloud solutions for Data & Analytics
* Work with architects and other stakeholders to understand business requirements and lead their implementation
* Design and implement security and compliance measures for AWS cloud environments
* Develop and implement DevOps practices to automate and streamline the software development and deployment process
* Mentor and train junior engineers on AWS technologies and best practices
* Stay up-to-date on the latest AWS technologies and trends

Qualifications
* 5+ years of experience in AWS (preference given to professionals certified at associate level or above as Cloud Data Engineer, Software Development Engineer or Application Architect)
* Strong understanding of AWS core services, including EC2, S3, RDS, Glue Data Quality, Glue Studio, Glue DataBrew, Kinesis, SNS, CloudWatch, EventBridge, Step Functions, Macie and VPC
* Experience with AWS security and compliance best practices
* Experience with DevOps practices and tools
* Experience with Data Solutions for Insights & Analytics and Lakehouse architectures
* Experience in mentoring and training other engineers
* Excellent communication and interpersonal skills

Desired Skills
* Experience with AWS serverless computing services, such as Lambda, Glue (with its full suite of services), and Athena
* Experience with AWS code repositories and CI/CD methods, namely AWS CodeCommit, CodePipeline and CodeBuild
* Experience in developing modern data pipelines in the context of an AWS-based data lakehouse architecture, with embedded data quality, complex data transformations and validations (a minimal sketch of such a job follows this list)
* Experience with creating, reading and handling different file formats like JSON, YAML and Parquet in the context of Infrastructure as Code and data lake architectures (see the second sketch below)
* Proficient in the AWS SDK, REST-based APIs and the use of containerisation (through ECS or EKS)
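
To make the pipeline bullet concrete, below is a minimal sketch of an AWS Glue ETL job in Python with a simple embedded validation step. It is illustrative only: the database, table and bucket names ("sales_db", "raw_orders", "s3://example-bucket/...") are placeholders, not details of the client's environment.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a catalogued source table into a DynamicFrame
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",      # placeholder database name
    table_name="raw_orders",  # placeholder table name
)

# A hand-rolled validation step: drop records missing the key field
valid = source.filter(lambda record: record["order_id"] is not None)

# Write the curated output back to the lake as Parquet
glue_context.write_dynamic_frame.from_options(
    frame=valid,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)

job.commit()
```

In a production lakehouse pipeline the inline filter would more likely be a Glue Data Quality ruleset, with Step Functions or EventBridge handling orchestration, as reflected in the services listed under Qualifications.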
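
And a short illustration of the file-format handling mentioned above, using pandas, PyYAML and pyarrow as one plausible toolset; the file names are hypothetical.

```python
import json

import pandas as pd
import yaml  # PyYAML

# JSON: e.g. an event payload or API response
with open("event.json") as f:
    event = json.load(f)

# YAML: e.g. an IaC template or pipeline definition
with open("template.yaml") as f:
    template = yaml.safe_load(f)

# Parquet: the columnar format typical of data lake storage
# (pandas needs pyarrow or fastparquet installed for this)
df = pd.read_parquet("orders.parquet")
df.to_parquet("orders_clean.parquet", index=False)
```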

This role has been deemed outside IR35 and has a particular focus on knowledge transfer and upskilling of the internal team, so it will require 1-2 days per week in Bristol.
