London, South East, England, United Kingdom Hybrid/Remote Options
Involved Solutions
…business decisions. Responsibilities for the AWS Data Engineer: Design, build and maintain scalable data pipelines and architectures within the AWS ecosystem. Leverage services such as AWS Glue, Lambda, Redshift, EMR and S3 to support data ingestion, transformation and storage. Work closely with data analysts, architects and business stakeholders to translate requirements into robust technical solutions. Implement and optimise ETL …
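As an illustration of the pipeline work this role describes, here is a minimal, hypothetical extract–transform–load sketch in plain Python. In practice this logic would typically run as a PySpark job on AWS Glue or EMR reading from S3; all names and the CSV schema here are invented for the example:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dicts (a stand-in for reading an S3 object)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Clean and type the raw rows: cast amounts to float, drop rows missing an id."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # skip malformed records
        cleaned.append({"order_id": row["order_id"], "amount": float(row["amount"])})
    return cleaned

def run_pipeline(raw_csv: str) -> list[dict]:
    """Compose the stages; on Glue/EMR each stage would be a distributed step."""
    return transform(extract(raw_csv))

sample = "order_id,amount\nA1,10.50\n,3.00\nA2,4.25\n"
print(run_pipeline(sample))
# → [{'order_id': 'A1', 'amount': 10.5}, {'order_id': 'A2', 'amount': 4.25}]
```

The point of the small, pure functions is that the same transformation logic can be exercised locally before it is deployed to a managed runtime.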
Have: ● Familiarity with data security, privacy, and compliance frameworks ● Exposure to machine learning pipelines, MLOps, or AI-driven data products ● Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark ● Exposure to AI/ML concepts and collaboration with data science or AI teams ● Experience integrating data solutions with AI/ML platforms or supporting AI …
the following: Python, SQL, Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices: code reviews, testing frameworks, CI/CD, and code …
data engineering processes. Design effective data solutions that meet complex business needs and support informed decision-making. Experience Required: Strong AWS expertise, including tools such as Glue, Lambda, Kinesis, EMR, Athena, DynamoDB, CloudWatch, SNS and Step Functions. Skilled in modern programming, particularly Python, Java, Scala and PySpark. Solid knowledge of data storage and big data technologies, including data warehouses …
with data frameworks (e.g., Spark, Kafka, Airflow) and ETL workflows. Proficiency with SQL and NoSQL databases, including query optimisation. Experience with cloud data services (e.g., AWS Redshift, S3, Glue, EMR) and CI/CD for data pipelines. Strong programming skills in Python, Java, or Scala. Excellent problem-solving and collaboration skills. Ability to thrive in a fast-paced, dynamic …
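The "query optimisation" skill mentioned in this listing can be illustrated with a self-contained sketch using SQLite from the Python standard library: the same query plan moves from a full table scan to an index search once a suitable index exists. Table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 100, f"2024-01-{(i % 28) + 1:02d}", "x") for i in range(10_000)],
)

# Without an index, this predicate forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7"
).fetchall()
print(plan_before)  # plan detail mentions SCAN

# With an index on the filtered column, the planner switches to an index search.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7"
).fetchall()
print(plan_after)  # plan detail mentions SEARCH ... USING INDEX idx_events_user
```

The same habit — reading the query plan before and after a change — carries over to Redshift (`EXPLAIN`) and other warehouses, though the plan formats differ.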
Manchester, England, United Kingdom Hybrid/Remote Options
Lorien
…years in a technical leadership or management role. Strong technical proficiency in data modelling, data warehousing, and distributed systems. Hands-on experience with cloud data services (AWS Redshift, Glue, EMR or equivalent). Solid programming skills in Python and SQL. Familiarity with DevOps practices (CI/CD, Infrastructure as Code, e.g. Terraform). Excellent communication skills with both technical and non-…
governance, and observability through integrated tooling (CloudWatch, CloudTrail, Prometheus, Grafana, and centralised observability). What you'll bring: Proven experience with AWS cloud data platforms, including EKS, EC2, S3, EMR, Lambda, and VPC design. Expertise in Infrastructure as Code (Terraform) and container orchestration (Kubernetes, Docker), with strong automation and templating skills. Hands-on experience with CI/CD …
seeking to hire a Data Engineering Manager to play a key role in their data operations and business intelligence initiatives. Key Responsibilities: Design & maintain AWS BI infrastructure (Redshift, S3, EMR) using Terraform and IaC best practices. Develop CI/CD pipelines (Jenkins, GitHub Actions) to automate ETL and Power BI code deployments. Manage environments (Dev, QA, UAT) and automate … data refreshes for accuracy and consistency. Oversee data pipelines and big data workflows (EMR, Spark) for high-performance analytics. Optimise code for ETL and Power BI (DAX, data models, refresh scheduling) to enhance performance. Implement observability and logging (CloudWatch, Grafana, ELK) for proactive system monitoring. Collaborate cross-functionally with BI, Platform, and Data teams on releases and issue resolution. … Monitor performance & costs in AWS, driving optimisation and efficiency. Champion automation & innovation through new tools, frameworks, and cloud-native solutions. Key Skills: AWS Cloud: Expert in Redshift, S3, Lambda, EMR, and IaC (Terraform/CloudFormation); strong understanding of big data architecture and performance optimisation. CI/CD & Automation: Skilled in Jenkins, GitHub Actions, and Python scripting for automated ETL …
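On the observability and logging responsibility in this listing, a common building block is structured (JSON) logging, so that shippers feeding CloudWatch or an ELK stack can index pipeline events by field rather than parsing free text. A minimal stdlib-only sketch; the logger name and context fields are illustrative, not from the listing:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line, as log shippers expect."""
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            # Extra context attached via the `extra=` kwarg, if present.
            "pipeline": getattr(record, "pipeline", None),
            "rows": getattr(record, "rows", None),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("etl")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("load complete", extra={"pipeline": "orders_daily", "rows": 1234})
```

Each emitted line is valid JSON, so a CloudWatch metric filter or Logstash grok-free pipeline can alert on, say, `rows == 0` without brittle string matching.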
…site. Must be SC cleared, or eligible to obtain SC-level security clearance. Job description: We are looking for a developer with expertise in Python, AWS Glue, Step Functions, EMR clusters and Redshift to join our AWS Development Team. You will serve as the functional and domain expert in the project team to ensure client expectations are met. …
to shape it with us. Your role will involve: Designing and developing scalable, testable data pipelines using Python and Apache Spark. Orchestrating data workflows with AWS tools like Glue, EMR Serverless, Lambda, and S3. Applying modern software engineering practices: version control, CI/CD, modular design, and automated testing. Contributing to the development of a lakehouse architecture using Apache …
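"Testable" pipelines, as this listing puts it, usually means factoring transformations into pure functions that behave identically under Spark and under a plain unit test. A hypothetical sketch in plain Python (in production the same aggregation could be a `groupBy`/`sum` over a Spark DataFrame; the field names are invented):

```python
from collections import defaultdict

def daily_totals(records: list[dict]) -> dict[str, float]:
    """Aggregate amounts per day. Pure function: no I/O, so trivially unit-testable."""
    totals: dict[str, float] = defaultdict(float)
    for rec in records:
        totals[rec["date"]] += rec["amount"]
    return dict(totals)

# A unit test needs no cluster: feed a tiny fixture, assert on the result.
fixture = [
    {"date": "2024-01-01", "amount": 2.0},
    {"date": "2024-01-01", "amount": 3.5},
    {"date": "2024-01-02", "amount": 1.0},
]
assert daily_totals(fixture) == {"2024-01-01": 5.5, "2024-01-02": 1.0}
```

Keeping business logic out of the orchestration layer (Glue jobs, Lambda handlers) is what makes this separation possible.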
similar role. Experience of deploying and managing cloud infrastructure for data solutions. Hands-on experience of working with AWS services, including but not limited to EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc. Good experience in setting up reliable, highly secure cloud networking. Experience of setting up standard cloud governance policies through IAM roles. Extensive experience …
London, England, United Kingdom Hybrid/Remote Options
Cint
move fast, stay compliant and take end-to-end responsibility for their products. Major elements of our platform include AWS (we make significant use of S3, RDS, Kinesis, EC2, EMR, ElastiCache, Elasticsearch and EKS). Elements of the platform will start to expand into GCP (Compute Engine, Cloud Storage, Google Kubernetes Engine and BigQuery). Other significant tools of …
Track record of delivering data analytics and AI/ML-enabling solutions across complex environments. Hands-on experience with cloud data platforms, ideally AWS (S3, Kinesis, Glue, Redshift, Lambda, EMR). Experience with Azure technologies (ADF, Synapse, Fabric, Azure Functions) is also valued. Strong understanding of modern data lakehouse architectures, such as Databricks, Snowflake, or Microsoft Fabric (highly desirable) …
DevOps tools and techniques. What You’ll Bring: Proven experience in DevOps or a similar engineering role. Proven experience deploying and managing AWS infrastructure (EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc.). Strong background in Terraform for infrastructure as code. Expertise in CI/CD automation using GitHub Actions. Hands-on experience managing Linux environments and …
…change) and driven to achieve a full end-to-end continuous deployment pipeline. Major elements of our platform include AWS (we make significant use of S3, RDS, Kinesis, EC2, EMR, ElastiCache, Elasticsearch and EKS). Elements of the platform will start to expand into GCP (Compute Engine, Cloud Storage, Google Kubernetes Engine and BigQuery). Other significant tools of …
data management. Experience in developing data analytics and AI-driven solutions. Skilled in designing and automating data quality metrics and KPIs. Proficient with AWS (S3, Kinesis, Glue, Redshift, Lambda, EMR) and/or Azure (ADF, Synapse, Fabric, Functions). To be considered, please either apply online or email me directly at henry.clay-davies@searchability.com. For further information …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid/Remote Options
Reed
projects. Required Skills & Qualifications: Demonstrable experience in building data pipelines using Spark or Pandas. Experience with major cloud providers (AWS, Azure, or Google). Familiarity with big data platforms (EMR, Databricks, or DataProc). Knowledge of data platforms such as Data Lakes, Data Warehouses, or Data Meshes. Drive for self-improvement and eagerness to learn new programming languages. Ability …