Life Cycle, agile methodologies such as CI/CD, Application Resiliency, and Security Cloud implementation experience with AWS including: AWS Data Services: Glue ETL or EMR, S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS; Data De/Serialization: Parquet and JSON formats; AWS Data Security: good understanding of security concepts such as IAM and service roles …
Security In-depth knowledge of the financial services industry and their IT systems Cloud implementation experience with AWS including: AWS Data Services: Glue ETL or EMR, S3, Glue Catalog, Athena, Lambda + Step Functions + EventBridge, ECS; Data De/Serialization: Parquet and JSON formats; AWS Data Security: good understanding of security concepts such as IAM and service roles …
solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Leverage AWS services (e.g., S3, EC2, Athena, Lambda, Glue) to build scalable and secure cloud solutions. Produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code …
Penryn, England, United Kingdom Hybrid / WFH Options
Aspia Space
data engineering, data architecture, or similar roles.
•Expert proficiency in Python, including popular data libraries (Pandas, PySpark, NumPy, etc.).
•Strong experience with AWS services, specifically S3, Redshift, Glue (Athena a plus).
•Solid understanding of applied statistics.
•Hands-on experience with large-scale datasets and distributed systems.
•Experience working across hybrid environments: on-premise HPCs and cloud platforms. …
will be working on complex data problems in a challenging and fun environment, using some of the latest Big Data open-source technologies like Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda to develop scalable data solutions. Key Responsibilities: Adhering to Company Policies and Procedures with respect to Security, Quality and …
and customise them for different use cases. Develop data models and Data Lake designs around stated use cases to capture KPIs and data transformations. Identify relevant AWS services, such as Amazon EMR, Redshift, Athena, Glue and Lambda, to design an architecture that can support client workloads/use-cases; evaluate pros/cons among the identified options to arrive at …
Collaborate with development teams to design and implement automated tests for microservices, emphasizing Spring Boot and Java-based architectures. Implement testing strategies for AWS data lakes (e.g., S3, Glue, Athena) with a focus on schema evolution, data quality rules, and performance benchmarks, prioritizing data lake testing over traditional SQL approaches. Automate data tests within CI/CD workflows to … maintain scalable test automation frameworks, with a focus on backend, API, and data systems using tools like Pytest and Postman. Expertise in Pandas, SQL, and AWS analytics services (Glue, Athena, Redshift) for data profiling, transformation, and validation within data lakes. Solid experience with AWS (S3, Lambda, EMR, ECS/EKS, CloudFormation/Terraform) and understanding of cloud-native architectures …
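Purely as an illustration of the data-lake quality checks described in the listing above, here is a minimal Pytest and pandas sketch; the S3 path, column names and thresholds are hypothetical, not taken from the posting.

```python
# Minimal sketch of a data-quality test for a data-lake extract.
# Assumes pandas, pyarrow and s3fs are installed; the S3 path, columns
# and checks below are illustrative placeholders only.
import pandas as pd
import pytest

EXTRACT_PATH = "s3://example-data-lake/curated/orders/date=2024-01-01/"  # hypothetical
EXPECTED_COLUMNS = {"order_id", "customer_id", "amount", "created_at"}


@pytest.fixture(scope="module")
def orders() -> pd.DataFrame:
    # Reading straight from S3 needs s3fs; a local path also works.
    return pd.read_parquet(EXTRACT_PATH)


def test_schema_contains_expected_columns(orders):
    # Schema-evolution check: new columns may appear, but none may vanish.
    assert EXPECTED_COLUMNS.issubset(orders.columns)


def test_key_columns_have_no_nulls(orders):
    assert orders["order_id"].notna().all()
    assert orders["customer_id"].notna().all()


def test_amounts_are_non_negative(orders):
    assert (orders["amount"] >= 0).all()
```

Tests like these sit naturally in the CI/CD workflows the listing mentions, running after each pipeline deployment.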
organizational levels. Analytical, organizational, and problem-solving skills. Experience with data observability tools like Grafana, Splunk, AWS CloudWatch, Kibana, etc. Knowledge of container technologies such as Docker, Kubernetes, and Amazon EKS. Education Requirements: Bachelor’s Degree in Computer Science, Engineering, or related field, or at least 8 years of equivalent work experience. 8+ years of IT data/system …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
and maintaining robust, cloud-native data pipelines Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks or Snowflake Collaborating closely with engineers, analysts, and client teams to deliver value-focused data … ve got solid experience working with Python, SQL, Spark and data pipeline tools such as dbt or Airflow You’re comfortable working across cloud platforms – especially AWS (Glue, Lambda, Athena), Azure (Data Factory, Synapse), or GCP (Dataflow, BigQuery, Cloud Composer) You have a good understanding of data modelling, data warehousing and performance optimisation You care deeply about data quality …
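As a hedged illustration of the orchestration work mentioned above, a minimal Apache Airflow DAG might look like the sketch below; the DAG id, tasks and schedule are assumptions for illustration only, not this team's actual pipelines.

```python
# Minimal Airflow DAG sketch for a daily extract-then-transform job.
# The DAG id, task callables and schedule are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw data from a source system into S3.
    print("extracting raw data")


def transform(**context):
    # Placeholder: trigger a Glue job, dbt model or Spark step here.
    print("transforming data")


with DAG(
    dag_id="daily_sales_pipeline",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # assumes Airflow 2.4+ syntax
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```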
experience. Excellent SQL experience across various platforms (SQL, PostgreSQL, PL/SQL, etc.). Experience with several of the following database technologies: MySQL, Oracle, SQL Server, Postgres, RDS, Aurora, Athena, MongoDB, or similar large-scale databases. CVs should be sent to Nick ASAP for immediate review. Required Skills: Agile, CVS, React, AWS, Data Migration, PL/SQL, DevOps, Timelines …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
data warehouse. These services provide the "business-ready" data that powers our products. We work in an Agile environment using modern tools and technologies including AWS, Glue, Step Functions, Athena, PySpark, SQL, and Python. Our processes are metadata-driven to ensure scalable, consistent, and reliable data delivery. Your Role You’ll work closely with the analytics team to design …
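To illustrate the metadata-driven style of pipeline this listing describes, here is a simplified, hypothetical PySpark sketch in which the step's behaviour is driven by a metadata record rather than hard-coded; the metadata keys, S3 paths and column names are invented and do not reflect the team's actual design.

```python
# Hypothetical sketch of a metadata-driven PySpark step: source, columns,
# filter and target are defined as data, so the same code serves many feeds.
# Paths and column names are illustrative placeholders only.
from pyspark.sql import SparkSession

PIPELINE_METADATA = {
    "source_path": "s3://example-raw-zone/companies/",
    "columns": ["company_id", "name", "country", "status"],
    "filter": "status = 'ACTIVE'",
    "target_path": "s3://example-curated-zone/companies/",
}


def run_step(meta: dict) -> None:
    spark = SparkSession.builder.appName("metadata-driven-step").getOrCreate()
    df = (
        spark.read.parquet(meta["source_path"])
        .select(*meta["columns"])
        .where(meta["filter"])
    )
    # Write curated Parquet that Athena or the warehouse can query directly.
    df.write.mode("overwrite").parquet(meta["target_path"])


if __name__ == "__main__":
    run_step(PIPELINE_METADATA)
```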
Exeter, England, United Kingdom Hybrid / WFH Options
jobs24.co.uk
variety of projects in the cloud (AWS, Azure, GCP), while also gaining opportunities to learn about and use data services such as Databricks, Data Factory, Synapse, Kafka, Redshift, Glue, Athena, BigQuery, S3, Cloud Data Fusion etc. Responsibilities You're an engineer at heart and enjoy the challenge of building reliable, efficient data applications, systems, services, and platforms. You have …
Airflow. Experience with low/no-code pipeline development tools such as Talend or SnapLogic. Experience developing data pipelines using cloud services (AWS preferred) like Lambda, S3, Redshift, Glue, Athena, Secrets Manager or equivalent services. Experience of working with APIs for data extraction and interacting with cloud resources via APIs/CLIs/SDKs (e.g. boto3). Experience building …
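As an example of the SDK-based interaction with cloud resources mentioned above, the following boto3 sketch runs an Athena query from Python and reads the results; the region, database, query and output bucket are placeholder assumptions.

```python
# Minimal boto3 sketch: run an Athena query and poll until it completes.
# Database, table and output location are hypothetical placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="eu-west-1")

response = athena.start_query_execution(
    QueryString="SELECT country, COUNT(*) AS n FROM events GROUP BY country",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Poll for completion (simplified; production code should add a timeout).
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```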
innovation and debate at the heart of University life. We are committed to proactively addressing the barriers experienced by some groups in our community and are proud to hold Athena SWAN, Race Equality Charter and Disability Confident accreditations. We have an Equality Diversity and Inclusion Centre that focuses on continuously improving the University as a fair and inclusive place …
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
Senior Data Engineer - Data Infrastructure and Architecture: C-4 Analytics C-4 Analytics is a fast-growing, private, full-service digital marketing company that excels at helping automotive dealerships increase sales, increase market share, and lower cost per acquisition. We …
Summary Yelp engineering culture is driven by our values: we're a cooperative team that values individual authenticity and encourages creative solutions to problems. All new engineers deploy working code their first week, and we strive to broaden individual impact …
Software Development Engineer, Spektr, Advertising Core Services Would you like to build highly available, scalable and distributed engineering systems for one of the largest data lakes in Amazon? Does Petabyte scale excite you? The Spektr Datalake team owns the central data lake for Advertising, unifying Petabytes of data generated across the Ads pipeline such as campaigns, ad-serving, billing … to design and innovate solutions for this scale, delivering robust and scalable microservices built over Java and AWS as well as innovate with big data technologies like Spark, EMR, Athena and more. You will create value that materially impacts the speed and quality of decision making across the organization, resulting in tangible business growth. Key job responsibilities - Engage with … and usability via simplified metrics and drive innovations to improve guarantees for our customers - Build frugal solutions that will help make the Spektr data lake the leanest data lake in Amazon in terms of cost About the team The mission of the Spektr Datalake team is to provide data that helps the advertising organization make informed analyses and decisions for our customers and to …
Maidenhead, Royal Borough of Windsor and Maidenhead, Berkshire, United Kingdom
Kensington Mortgages
Engineer and shape the future of our cloud-based communication systems. We are looking for an experienced hands-on engineer to support the development, deployment, and maintenance of our Amazon Connect-based contact centre environment. This role plays a key part in delivering digital routing solutions, supporting internal users including training, and ensuring our systems run smoothly and securely. … and working with 3rd party service providers. Experience, Knowledge, Skills Strong experience designing, deploying and maintaining contact centre environments (Avaya, Genesys etc.) including SIP Solid hands-on knowledge of Amazon Connect, AWS Lambda, Lex Bots and DynamoDB Strong background in designing and implementing call routing systems, including routing profiles, queues, callbacks, emergency messaging, hours of operation, and holiday routing … Excellent communication and analytical skills, with the ability to coordinate effectively with sysadmins, developers and business stakeholders. Desirable Skills Experience with AWS services like Contact Lens, Transcribe, Comprehend, S3, Athena, QuickSight, Lambda, Elasticsearch, Kibana and Kinesis. AWS certification. Knowledge of AWS Workforce Management tools including Quality Monitoring scoring. Experience with working in a regulated financial services environment …
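The Connect, Lambda and DynamoDB pattern described above can be illustrated with a short, hypothetical Lambda handler: invoked from a contact flow, it looks up the caller in DynamoDB and returns flat attributes the flow can route on. The table name, keys and attribute names are assumptions, not details from the posting.

```python
# Illustrative AWS Lambda handler invoked from an Amazon Connect contact flow.
# It looks up the caller's number in DynamoDB and returns flat key/value
# attributes the flow can branch on. Names below are hypothetical placeholders.
import boto3

dynamodb = boto3.resource("dynamodb")
customers = dynamodb.Table("customer-profiles")  # hypothetical table name


def lambda_handler(event, context):
    # Connect passes contact details under Details.ContactData.
    contact = event["Details"]["ContactData"]
    caller_number = contact["CustomerEndpoint"]["Address"]

    item = customers.get_item(Key={"phone_number": caller_number}).get("Item")

    # Connect expects a flat dictionary of string values in the response.
    if not item:
        return {"customerFound": "false", "routeTo": "general_queue"}

    return {
        "customerFound": "true",
        "customerName": str(item.get("name", "")),
        "routeTo": str(item.get("preferred_queue", "general_queue")),
    }
```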
Brentwood, England, United Kingdom Hybrid / WFH Options
Sky
using Linux systems and networking protocols, including packet capture analysis. Design and enhance operational tools and architect DevOps solutions to optimize system performance and efficiency. Leverage AWS technologies (S3, Athena, QuickSight) to analyse data from millions of field devices, delivering insights to inform decision-making and drive operational efficiency. Develop and implement anomaly detection techniques and data-driven solutions … technical decisions and convince others about the merits and reasons for those decisions. Experienced in defect tracking tools such as Jira; SCM tools: Git & GitHub; SQL and/or Amazon Athena experience. Team overview: The RDK Global Triage Team within the RDK development team plays a critical role in ensuring the quality and reliability of TV and Broadband …
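As a loose illustration of the anomaly detection this role mentions, the sketch below applies a simple z-score check to device metrics already pulled into pandas (for example from an Athena export); the metric, column names and threshold are hypothetical.

```python
# Simple z-score anomaly flagging over per-device metrics, assuming the data
# has already been exported (e.g. via Athena) into a DataFrame.
# Column names and the threshold are illustrative placeholders.
import pandas as pd


def flag_anomalies(df: pd.DataFrame, metric: str = "reboot_count",
                   z_threshold: float = 3.0) -> pd.DataFrame:
    mean = df[metric].mean()
    std = df[metric].std()
    out = df.copy()
    out["z_score"] = (out[metric] - mean) / std
    out["is_anomaly"] = out["z_score"].abs() > z_threshold
    return out


# Example usage with dummy data:
sample = pd.DataFrame({
    "device_id": ["a", "b", "c", "d"],
    "reboot_count": [1, 2, 1, 40],
})
print(flag_anomalies(sample)[["device_id", "reboot_count", "is_anomaly"]])
```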
complex data pipelines. These pipelines ingest and transform data from diverse sources (e.g., email, CSV, ODBC/JDBC, JSON, XML, Excel, Avro, Parquet) using AWS technologies such as S3, Athena, Redshift, Glue, and programming languages like Python and Java (Docker/Spring). What You’ll Do Lead the design and development of scalable data pipelines and ETL processes … for at least 5 years. What We’re Looking for Proven experience leading data engineering teams and delivering technical solutions Strong background in cloud data platforms, especially AWS (Redshift, Athena, EC2, IAM, Lambda, CloudWatch) Proficiency in automation tools and languages (e.g., GitHub/GitLab, Python, Java). Skilled in stakeholder engagement and translating requirements into actionable insights. Ability to …
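One ingestion path of the kind described above could look like the minimal pandas sketch below, converting a raw CSV drop into Parquet for Athena or Redshift Spectrum to query; the bucket names and columns are placeholders, and the real pipelines here also use Glue and Java.

```python
# Minimal sketch of one ingestion path: read a raw CSV drop, apply a light
# transformation and write query-friendly Parquet back to S3.
# Requires pandas, pyarrow and s3fs; paths and columns are placeholders.
import pandas as pd

RAW_PATH = "s3://example-raw-bucket/invoices/2024-01-01.csv"
CURATED_PATH = "s3://example-curated-bucket/invoices/2024-01-01.parquet"

df = pd.read_csv(RAW_PATH)

# Light transformation: normalise column names and parse dates.
df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
df["invoice_date"] = pd.to_datetime(df["invoice_date"], errors="coerce")

df.to_parquet(CURATED_PATH, index=False)
```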
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Santander
need to be successful in this role: Experience developing, testing, and deploying data pipelines, data lakes, data warehouses, and data marts using ideally AWS services such as S3, Glue, Athena, EMR, Kinesis, and Lambda Understanding of the principles behind designing and implementing data lake, lake house and/or data mesh architecture Problem-solving skills with basic knowledge of … Clear and effective communication skills to interact with team members, stakeholders and end users, conveying technical concepts in a comprehensible manner Skills across the following data competencies: SQL (AWS Athena/Hive/Snowflake); Hadoop/EMR/Spark/Scala; data structures (tables, views, stored procedures); data modelling - star/snowflake schemas, efficient storage, normalisation; data transformation; DevOps …
of consolidating data sources, creating pipelines, and ideally a software engineering mindset. Excellent experience with containers. Strong knowledge of Python, Airflow, SQL, and Linux. Proficient in AWS Lambda, ECS, Athena, S3, Kinesis, Glue, CloudWatch/Terraform/CDK, Tableau Online. They are particularly keen on Senior Data Engineers who utilize good AI-based tooling for productivity, such as Co…
Cucumber); Linux environments; AWS environment experience; MariaDB, Oracle, MySQL, AWS Aurora (2 or more); 2+ years in MySQL scripting within cloud Data Migration projects; AWS data migration testing, including Athena, Data Migration Service, or similar tools. Additional skills include: self-management and proactive decision-making with risk assessment; ability to translate technical concepts for non-technical stakeholders; knowledge sharing …