Manchester, England, United Kingdom Hybrid/Remote Options
Lorien
years in a technical leadership or management role. Strong technical proficiency in data modelling, data warehousing, and distributed systems. Hands-on experience with cloud data services (AWS Redshift, Glue, EMR or equivalent). Solid programming skills in Python and SQL. Familiarity with DevOps practices (CI/CD, Infrastructure as Code, e.g. Terraform). Excellent communication skills with both technical and non-technical More ❯
site. Must be SC cleared, or eligible to obtain SC-level security clearance. Job description: We are looking for a developer with expertise in Python, AWS Glue, Step Functions, EMR clusters and Redshift to join our AWS Development Team. You will serve as the functional and domain expert in the project team to ensure client expectations are met. More ❯
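For context, a minimal, hypothetical sketch of the kind of Glue development this role describes: a PySpark-based Glue job that reads a catalogued dataset, applies a simple filter, and loads the result into Redshift. The database, table, connection and bucket names are invented placeholders, not details from the advert.

```python
# Hypothetical AWS Glue job sketch (placeholder names throughout).
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read source data registered in the Glue Data Catalog (placeholder names)
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)

# Simple transform: keep completed orders only
completed = orders.filter(lambda row: row["status"] == "COMPLETED")

# Load into Redshift via a pre-configured Glue connection (placeholder names)
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=completed,
    catalog_connection="redshift-connection",
    connection_options={"dbtable": "analytics.completed_orders", "database": "dw"},
    redshift_tmp_dir="s3://example-temp-bucket/redshift/",
)

job.commit()
```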
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
as Airflow, dbt, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub). Knowledge of cloud infrastructure (AWS or GCP), including services such as S3, RDS, EMR, ECS, IAM. Experience with DevOps tooling, particularly Terraform and CI/CD pipelines (e.g. Jenkins). A proactive, growth-oriented mindset with a passion for modern data and platform More ❯
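As an illustration of the orchestration stack named here, a short, hypothetical Airflow DAG that runs an extract step and then a dbt build. The DAG id, schedule, script path and dbt project directory are assumed placeholders.

```python
# Hypothetical Airflow DAG sketch (Airflow 2.x style; names are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule_interval" on older Airflow 2.x releases
    catchup=False,
) as dag:
    # Extract step, e.g. landing raw files in object storage (placeholder command)
    extract = BashOperator(
        task_id="extract_raw_data",
        bash_command="python /opt/pipelines/extract_to_s3.py",
    )

    # Transform step: run dbt models against the warehouse
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="dbt build --project-dir /opt/dbt/warehouse",
    )

    extract >> dbt_build
```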
London, England, United Kingdom Hybrid/Remote Options
Cint
move fast, stay compliant and take end-to-end responsibility for their products. Major elements of our platform include AWS (we make significant use of S3, RDS, Kinesis, EC2, EMR, ElastiCache, Elasticsearch and EKS). Elements of the platform will start to expand into GCP (Compute Engine, Cloud Storage, Google Kubernetes Engine and BigQuery). Other significant tools of More ❯
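A minimal, hypothetical illustration of working with two of the AWS services listed above (Kinesis and S3) via boto3. The region, stream name, bucket and event payload are invented for the sketch.

```python
# Hypothetical boto3 usage sketch (placeholder names and data).
import json

import boto3

kinesis = boto3.client("kinesis", region_name="eu-west-1")
s3 = boto3.client("s3", region_name="eu-west-1")

event = {"order_id": 123, "status": "completed"}

# Publish an event to a Kinesis stream (placeholder stream name)
kinesis.put_record(
    StreamName="order-events",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=str(event["order_id"]),
)

# Persist a raw copy of the event to S3 (placeholder bucket and key)
s3.put_object(
    Bucket="example-raw-events",
    Key="orders/123.json",
    Body=json.dumps(event).encode("utf-8"),
)
```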
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid/Remote Options
Reed
projects. Required Skills & Qualifications: Demonstrable experience in building data pipelines using Spark or Pandas. Experience with major cloud providers (AWS, Azure, or Google). Familiarity with big data platforms (EMR, Databricks, or Dataproc). Knowledge of data platforms such as Data Lakes, Data Warehouses, or Data Meshes. Drive for self-improvement and eagerness to learn new programming languages. Ability More ❯
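To ground the Spark requirement above, a short, hypothetical PySpark pipeline sketch: read raw CSV files, aggregate, and write partitioned Parquet. Paths and column names are assumptions, not details from the advert.

```python
# Hypothetical PySpark pipeline sketch (placeholder paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-aggregation").getOrCreate()

# Read raw data (placeholder path; could equally be abfss:// or gs:// in practice)
sales = spark.read.csv("s3://example-raw/sales/", header=True, inferSchema=True)

# Aggregate revenue per day and product
daily_totals = (
    sales.groupBy("sale_date", "product_id")
    .agg(F.sum("amount").alias("total_revenue"))
)

# Write the curated output as partitioned Parquet (placeholder path)
daily_totals.write.mode("overwrite").partitionBy("sale_date").parquet(
    "s3://example-curated/daily_sales/"
)
```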
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom Hybrid/Remote Options
Reed
What We’re Looking For: Experience building data pipelines using Spark or Pandas. Familiarity with major cloud platforms (AWS, Azure, or GCP). Understanding of big data tools (EMR, Databricks, Dataproc). Knowledge of data architectures (Data Lakes, Warehouses, Mesh). A proactive mindset with a passion for learning new technologies. Nice-to-Have Skills: Automated data quality More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Performance-related bonus. Private medical care. And many more. Role and Responsibilities: Develop and maintain AWS-based data pipelines using Python, PySpark, Spark SQL, AWS Glue, Step Functions, Lambda, EMR, and Redshift. Design, implement, and optimise data architecture for scalability, performance, and security. Work closely with business and technical stakeholders to understand requirements and translate them into robust solutions. … client workshops, gather feedback, and provide technical guidance. Required Skills & Experience: Strong hands-on experience in Python, PySpark, and Spark SQL. Proven expertise in AWS Glue, Step Functions, Lambda, EMR, and Redshift. Solid understanding of cloud architecture, security, and scalability best practices. Experience designing and implementing CI/CD pipelines for data workflows. Proven ability to structure solutions, solve More ❯
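One hypothetical sketch of how the Lambda and Step Functions pieces named above can fit together: a Lambda handler that starts an AWS Glue job run and returns the run id, suitable as a single state in a Step Functions workflow. The job name and argument are invented placeholders.

```python
# Hypothetical Lambda handler sketch (placeholder Glue job name and argument).
import boto3

glue = boto3.client("glue")


def handler(event, context):
    """Start a Glue job run; Step Functions can poll the returned run id."""
    response = glue.start_job_run(
        JobName="nightly-orders-etl",                      # placeholder job name
        Arguments={"--run_date": event.get("run_date", "")},
    )
    return {"job_run_id": response["JobRunId"]}
```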