Have: ● Familiarity with data security, privacy, and compliance frameworks ● Exposure to machine learning pipelines, MLOps, or AI-driven data products ● Experience with big data platforms and technologies such as EMR, Databricks, Kafka, and Spark ● Exposure to AI/ML concepts and collaboration with data science or AI teams ● Experience integrating data solutions with AI/ML platforms or supporting AI …
Bristol, Avon, England, United Kingdom Hybrid/Remote Options
Aspire Personnel Ltd
…in AWS cloud technologies for ETL pipeline, data warehouse, and data lake design/build and data movement. AWS data and analytics services (or open-source equivalents) such as EMR, Glue, Redshift, Kinesis, Lambda, and DynamoDB. What you can expect: work to agile best practices and cross-functionally with multiple teams and stakeholders. You’ll be using your technical skills …
…the following: Python, SQL, Java. Commercial experience in client-facing projects is a plus, especially within multi-disciplinary teams. Deep knowledge of database technologies: distributed systems (e.g., Spark, Hadoop, EMR); RDBMS (e.g., SQL Server, Oracle, PostgreSQL, MySQL); NoSQL (e.g., MongoDB, Cassandra, DynamoDB, Neo4j). Solid understanding of software engineering best practices - code reviews, testing frameworks, CI/CD, and code …
…with data frameworks (e.g., Spark, Kafka, Airflow) and ETL workflows. Proficiency with SQL and NoSQL databases, including query optimization. Experience with cloud data services (e.g., AWS Redshift, S3, Glue, EMR) and CI/CD for data pipelines. Strong programming skills in Python, Java, or Scala. Excellent problem-solving and collaboration skills. Ability to thrive in a fast-paced, dynamic …
Manchester, England, United Kingdom Hybrid/Remote Options
Lorien
…years in a technical leadership or management role. Strong technical proficiency in data modelling, data warehousing, and distributed systems. Hands-on experience with cloud data services (AWS Redshift, Glue, EMR, or equivalent). Solid programming skills in Python and SQL. Familiarity with DevOps practices (CI/CD, Infrastructure as Code - e.g., Terraform). Excellent communication skills with both technical and non-technical …
…governance, and observability through integrated tooling (CloudWatch, CloudTrail, Prometheus, Grafana, and centralised observability). What you'll bring: Proven experience with AWS cloud data platforms, including EKS, EC2, S3, EMR, Lambda, and VPC design. Expertise in Infrastructure as Code (Terraform) and container orchestration (Kubernetes, Docker), with strong automation and templating skills. Hands-on experience with CI/CD …
…and delivering production-grade software and data systems. Proficiency in Python, Java, or Scala - comfortable writing robust, testable, and scalable code. Deep experience with AWS (Lambda, ECS/EKS, EMR, Step Functions, S3, IAM, etc.). Strong knowledge of distributed systems and streaming/data pipelines (Kafka, Spark, Delta, Airflow, etc.). Familiarity with infrastructure-as-code (Terraform, CloudFormation, …)
…seeking to hire a Data Engineering Manager to play a key role in their data operations and business intelligence initiatives. Key Responsibilities: Design & maintain AWS BI infrastructure (Redshift, S3, EMR) using Terraform and IaC best practices. Develop CI/CD pipelines (Jenkins, GitHub Actions) to automate ETL and Power BI code deployments. Manage environments (Dev, QA, UAT) and automate … data refreshes for accuracy and consistency. Oversee data pipelines and big data workflows (EMR, Spark) for high-performance analytics. Optimize code for ETL and Power BI (DAX, data models, refresh scheduling) to enhance performance. Implement observability and logging (CloudWatch, Grafana, ELK) for proactive system monitoring. Collaborate cross-functionally with BI, Platform, and Data teams on releases and issue resolution. … Monitor performance & costs in AWS, driving optimisation and efficiency. Champion automation & innovation through new tools, frameworks, and cloud-native solutions. Key Skills: AWS Cloud: Expert in Redshift, S3, Lambda, EMR, and IaC (Terraform/CloudFormation); strong understanding of big data architecture and performance optimisation. CI/CD & Automation: Skilled in Jenkins, GitHub Actions, and Python scripting for automated ETL …
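The observability and logging requirement above (CloudWatch, Grafana, ELK) usually starts with structured logs that a log shipper can index. As a minimal, hedged sketch (field names and the `job` attribute are illustrative, not any specific team's standard), a JSON formatter for Python's standard `logging` module looks like this:

```python
# Minimal sketch of structured (JSON) logging for an ETL job: the raw
# material that CloudWatch Logs Insights, ELK, or Grafana can query.
# Field names here are illustrative, not a specific company's standard.
import json
import logging
import sys
import time

class JsonFormatter(logging.Formatter):
    def format(self, record):
        payload = {
            "ts": round(record.created, 3),          # epoch seconds
            "level": record.levelname,
            "job": getattr(record, "job", "unknown"),  # set via extra={...}
            "msg": record.getMessage(),
        }
        return json.dumps(payload)

logger = logging.getLogger("etl")
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

start = time.time()
logger.info("refresh started", extra={"job": "powerbi_refresh"})
logger.info("refresh finished in %.1fs", time.time() - start,
            extra={"job": "powerbi_refresh"})
```

Because every line is a self-describing JSON object, dashboards and alerts can filter on `job` or `level` without fragile regex parsing of free-text logs.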
…site. Must be SC cleared, or eligible to obtain SC-level security clearance. Job description: We are looking for a developer with expertise in Python, AWS Glue, Step Functions, EMR clusters, and Redshift to join our AWS Development Team. You will serve as the functional and domain expert in the project team to ensure client expectations are met. …
AI Data Engineer - 2 Positions. Boston, MA. Client: Movate. Long-term contract, onsite, $60 per hour W2. A robust background in AWS services such as Lambda, Glue, S3, EMR, SNS, SQS, CloudWatch, Redshift, and Bedrock. - Strong expertise in SQL and relational databases like Oracle, MySQL, and PostgreSQL. - Familiarity with the Salesforce platform, including data models and objects. - Proficiency in …
…database optimization skills. Excellent problem-solving and analytical skills. Good to Have: Knowledge of Python/PySpark for big data solutions. Experience with AWS cloud services (S3, Glue, Athena, EMR, Redshift). Mandatory Skills: Ab Initio (link removed) Experience: 5-8 Years (link removed) The expected compensation for this role ranges from $60,000 to $135,000 (link removed) Final …
…migration, metadata, and master data management. Experienced in delivering data analytics solutions and supporting AI initiatives. Hands-on experience with cloud data platforms, ideally AWS (S3, Kinesis, Glue, Redshift, Lambda, EMR) and/or Azure (ADF, Synapse, Fabric, Azure Functions). Familiarity with modern data lakehouse environments such as Databricks, Snowflake, or Microsoft Fabric (highly desirable). Experience with Palantir Foundry …
…to shape it with us. Your role will involve: Designing and developing scalable, testable data pipelines using Python and Apache Spark. Orchestrating data workflows with AWS tools like Glue, EMR Serverless, Lambda, and S3. Applying modern software engineering practices: version control, CI/CD, modular design, and automated testing. Contributing to the development of a lakehouse architecture using Apache …
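Several of the listings above ask for "scalable, testable data pipelines". The core idea behind that phrase is to keep transformation logic in pure functions, so it can be unit-tested without a Spark cluster; the same functions can then be applied inside a PySpark job on Glue or EMR Serverless. A framework-free sketch (all record and function names are invented for illustration):

```python
# Sketch: pipeline transforms as pure functions, unit-testable without a
# cluster. In production the same logic would run over PySpark DataFrames;
# the record/function names here are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    user_id: str
    amount_pence: int
    currency: str

def clean_events(events):
    """Drop malformed rows and normalise currency codes (the 'clean' step)."""
    return [
        Event(e.user_id, e.amount_pence, e.currency.upper())
        for e in events
        if e.user_id and e.amount_pence >= 0
    ]

def spend_per_user(events):
    """Aggregate total spend per user (the 'reduce' step of the pipeline)."""
    totals = {}
    for e in clean_events(events):
        totals[e.user_id] = totals.get(e.user_id, 0) + e.amount_pence
    return totals

raw = [
    Event("u1", 250, "gbp"),
    Event("u1", 100, "GBP"),
    Event("", 999, "GBP"),    # malformed: no user id, filtered out
    Event("u2", -5, "GBP"),   # malformed: negative amount, filtered out
]
print(spend_per_user(raw))  # {'u1': 350}
```

Because `clean_events` and `spend_per_user` take plain values and return plain values, they slot directly into a CI pipeline's unit-test stage, which is what "testable" means in practice.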
…similar role. Experience of deploying and managing cloud infrastructure for data solutions. Hands-on experience of working with AWS services, including but not limited to EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc. Good experience in setting up reliable, highly secure cloud networking. Experience of setting up standard cloud governance policies through IAM roles. Extensive experience …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
…as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, Datahub). Knowledge of cloud infrastructure (AWS or GCP), including services such as S3, RDS, EMR, ECS, and IAM. Experience with DevOps tooling, particularly Terraform and CI/CD pipelines (e.g. Jenkins). A proactive, growth-oriented mindset with a passion for modern data and platform …
London, England, United Kingdom Hybrid/Remote Options
Cint
…move fast, stay compliant, and take end-to-end responsibility for their products. Major elements of our platform include AWS (we make significant use of S3, RDS, Kinesis, EC2, EMR, ElastiCache, Elasticsearch, and EKS). Elements of the platform will start to expand into GCP (Compute Engine, Cloud Storage, Google Kubernetes Engine, and BigQuery). Other significant tools of …
…and maintain robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion … transformation, and integration - Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement, and tune Snowflake data warehouses to support analytical workloads and reporting needs - Partner with data scientists, analysts, and product teams to deliver reliable, well-documented datasets - Ensure data integrity, consistency, and accuracy across multiple sources and systems - Automate … and knowledge-sharing. Mandatory Skills Description: - 5+ years of experience in data engineering or software development - Strong proficiency in Python and PySpark - Hands-on experience with AWS services, especially EMR, S3, Lambda, and Glue - Deep understanding of Snowflake architecture and performance tuning - Solid grasp of data modeling, warehousing concepts, and SQL optimization - Familiarity with CI/CD tools (e.g. …
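One concrete pattern behind "ensure data integrity, consistency, and accuracy" in warehouse loads is the idempotent upsert: a batch that can be replayed without creating duplicate rows. Snowflake would express this with a `MERGE` statement; the sketch below uses stdlib `sqlite3` purely so the pattern runs locally, and the table and column names are invented for illustration:

```python
# Sketch of an idempotent "upsert" load step. Snowflake would use MERGE;
# sqlite3 is used here only so the pattern is runnable locally. The table
# and column names are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id TEXT PRIMARY KEY, plan TEXT)")

def load_batch(conn, rows):
    # ON CONFLICT makes the load safe to re-run: a replayed batch updates
    # existing rows instead of inserting duplicates.
    conn.executemany(
        """INSERT INTO users (user_id, plan) VALUES (?, ?)
           ON CONFLICT(user_id) DO UPDATE SET plan = excluded.plan""",
        rows,
    )

load_batch(conn, [("u1", "free"), ("u2", "pro")])
load_batch(conn, [("u1", "pro")])  # replayed batch: updates u1, no duplicate

print(conn.execute(
    "SELECT user_id, plan FROM users ORDER BY user_id").fetchall())
# [('u1', 'pro'), ('u2', 'pro')]
```

The key property is that running `load_batch` twice with overlapping data leaves the table in the same state as running it once, which is what lets orchestrators retry failed pipeline steps safely.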
Track record of delivering data analytics and AI/ML-enabling solutions across complex environments. Hands-on experience with cloud data platforms, ideally AWS (S3, Kinesis, Glue, Redshift, Lambda, EMR). Experience with Azure technologies (ADF, Synapse, Fabric, Azure Functions) is also valued. Strong understanding of modern data lakehouse architectures, such as Databricks, Snowflake, or Microsoft Fabric (highly desirable) …
…DevOps tools and techniques. What You’ll Bring: Proven experience in DevOps or a similar engineering role. Proven experience deploying and managing AWS infrastructure (EC2, S3, RDS, Lambda, Glue, EMR, VPC, IAM, Redshift, etc.). Strong background in Terraform for infrastructure as code. Expertise in CI/CD automation using GitHub Actions. Hands-on experience managing Linux environments and …