lead engineer is unavailable. The ideal candidate is detail-oriented, responsive, and experienced in cross-platform administration. Primary Responsibilities: Cloud & Storage Integration Assist in managing data pipelines between AWS S3, Azure Blob Storage, and Google Cloud Storage Support deployment and updates of serverless functions (e.g., Lambda, Azure Functions, Cloud Functions) Help maintain cloud storage access controls, secrets, and container More ❯
talent and have a diverse workforce. Your role will include: Design, develop, test, and deploy data integration processes (batch or real-time) in AWS using tools such as Redshift, S3, Glue, Athena, Lambda and Snowflake. Building pipelines, performing migrations, and integrating with external systems. Solving simple and complex problems. Liaising with stakeholders across the business as an internal consultant. … to aid the implementation of a brand new Snowflake Data Warehouse. They are looking for a candidate that has experience in... AWS Data Platform, strong knowledge of Snowflake, S3, Lambda, Data Modelling, DevOps practices, Airflow, dbt, Data Vault, Redshift, ODS, strong SQL/Python. This role is an urgent requirement; there are limited interview slots More ❯
Business Intelligence Engineer - Locations considered: London, Paris, Madrid, Milan, Munich, Berlin, EU Heavy and Bulky Services Job ID: Amazon EU SARL (UK Branch) Locations considered: London, Paris, Madrid, Milan, Munich, Berlin Are you interested in building data warehouse and data lake solutions to shape the analytical backbone of the EU Heavy Bulky & Services team? We are hiring a … engineering mentality. We seek individuals who enjoy collaborating across stakeholders and who bring excellent statistical and analytical abilities to the team. Heavy Bulky & Services is a growing Amazon business, designed to give Customers an enjoyable website browsing, product shopping and delivery experience for our specific product portfolio. We are looking for a Business Intelligence Engineer to support … SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling PREFERRED QUALIFICATIONS - Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift - Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets Amazon is an equal opportunities employer. We More ❯
activities, including: Design, implement and maintain Java Microservices Integrate data from one system to another via REST API Become an expert on an AWS data workflow that includes Lambda, S3, and other such technologies Ingest and normalize data from multiple data sources Develop technical documentation and briefing materials to support program status reviews, control gates, and other presentations as … Technical skills: Knowledge of standard ETL tools, such as Pentaho, AWS Glue, etc. Understanding of Java Microservices; understanding of REST APIs; understanding of AWS data workflows that include Lambda, S3, and other such technologies EDUCATION/EXPERIENCE: Bachelor's (Computer Engineering, Computer Science, Electrical Engineering, Information Systems, Information Technology, Cybersecurity, or a closely related discipline); Required Experience: 6+ yrs; Technical skills: Understanding of Java Microservices; understanding of REST APIs; understanding of AWS data workflows that include Lambda, S3, and other such technologies Current TS/SCI with Poly is required Pay Range: There are many factors that can influence final salary including, but not limited to, geographic location, Federal Government contract labor categories and contract wage rates, relevant prior More ❯
implementation and maintenance of Java Microservices Architect data integration from one system to another via REST API Become an expert on an AWS data workflow that includes AWS Lambda, S3, and other such technologies, as well as other cloud providers Architect and manage the ingestion and normalization of data from multiple data sources Develop technical documentation and briefing materials More ❯
Database Designer & Senior Data Engineer, with hands-on experience in cloud-based data services. Proficiency in SQL, Python, or Scala. Experience with AWS RDS, AWS Glue, AWS Kinesis, AWS S3, Redis and/or Azure SQL Database, Azure Data Lake Storage, Azure Data Factory. Knowledge of data modelling, ETL processes, and data warehousing principles. Preferred Skills Strong problem-solving More ❯
experience in one or more of the following: PowerShell, Ansible, Puppet, Chef, VBScript, C++, Unix Shell, Python, Perl, Ruby, JavaScript, etc. • Basic understanding of storage protocols (e.g. NFS, SMB, S3, FC, iSCSI, NVMe) • Good knowledge of Active Directory, DHCP, DNS, Group Policy. • Knowledge of MS Windows Server operating systems including Server 2016, 2019, and 2022 as well as RHEL More ❯
multiple environments for the purpose of development, testing, and production hosting of custom web applications on behalf of a Government customer. The current hosting environment utilizes Windows within Amazon Web Services. We are looking for an ambitious system administrator to help migrate that environment to Linux and/or employ Docker and Kubernetes as a solution. More About … to include web servers, database servers, and associated development systems (e.g. IIS, WordPress, RDS Database). Working knowledge of or familiarity with Amazon Web Services such as EC2, S3, RDS, and ELB. Maintains system reliability and uptime of production applications by measuring and monitoring availability, latency, and overall system health. Contributes to system configuration management in accordance with … Experience administering Windows; experience administering and supporting databases such as MySQL; WordPress and PHP; familiarity with JIRA; knowledge of DNS, PKI and Certificates; PowerShell scripting; experience in Amazon Web Services (AWS) systems administration; AWS Certification (AWS Solutions Architect - Associate or AWS SysOps Administrator - Associate) - What You Can Expect: A culture of integrity. At CACI, we place character More ❯
more languages (Python, Ansible, Java, C++) • Provision cloud access and accounts • Systems optimization at the OS and infrastructure levels in an AWS cloud environment • AWS cloud administration (EC2, EBS, S3, IAM, Lambda) • RHEL systems administration (OS patching, shell scripting, storage management) • Collaborate with multidisciplinary teams of business analysts, developers, data scientists, and subject-matter experts Requirements: • DoD 8570 IAT More ❯
warehouses (Snowflake, Redshift, BigQuery). Experience with Python and shell scripting is a plus. Experience with Agile development methods. Understanding of Tableau. Experience with AWS cloud technologies such as S3, EMR. The ability to communicate ideas, information, and viewpoints clearly, both verbally and in writing. Experience with Dimensional Data modeling (star schemas). Demonstrates the Core Values of Delta More ❯
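As a purely illustrative aside on the dimensional data modelling (star schema) skill this listing asks for — all table names and figures below are invented — a star schema amounts to fact rows keyed to dimension rows, queried by joining and aggregating:

```python
# Hypothetical star-schema sketch: one fact table (sales events)
# joined to one dimension table (products) to aggregate revenue
# per product category. Data is invented for illustration.
dim_product = {1: {"name": "widget", "category": "tools"}}
fact_sales = [
    {"product_id": 1, "qty": 3, "revenue": 30.0},
    {"product_id": 1, "qty": 1, "revenue": 10.0},
]

def revenue_by_category(facts, dim):
    """Join each fact row to its product dimension and sum revenue."""
    totals = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["revenue"]
    return totals
```

In a warehouse such as Snowflake or Redshift the same shape is expressed as a SQL join from the fact table to its dimension tables.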
into CI/CD pipelines. Unix/Linux-based systems - for using command-line tools, scripting, and log analysis. AWS (or similar cloud provider) - with a focus on IAM, S3 access controls, and common misconfiguration risks. SQL/MongoDB/Oracle - for testing injection flaws, access controls, and data sanitisation. Karate DSL or Rest Assured - for automating security-focused More ❯
and Hadoop. Solid understanding of data governance principles, data modeling, data cataloging, and metadata management. Hands-on experience with cloud platforms like AWS or Azure, including relevant services like S3, EMR, Glue, Data Factory, etc. Proficiency in SQL and one or more programming languages (Python, Scala, or Java) for data manipulation and transformation. Knowledge of data security and privacy More ❯
administration skills. Ansible and Terraform experience. Strong Kubernetes experience; we want to see people who have worked on Kubernetes implementation, been involved in architecture, hands-on migration etc. K8s, Spark, S3, Terraform, Ansible, CI/CD, Hadoop, Linux/RHEL - on-prem background/container management, Grafana or Elasticsearch - for observability, Kubernetes security, Observability - OpenTelemetry, Argo Other More ❯
a strong AWS focus. Active TS/SCI security clearance. Must be located in Northern Virginia. Must be able to work 100% onsite. Deep expertise in AWS services: EC2, S3, VPC, IAM, CloudFormation, Control Tower. Proficiency in infrastructure automation tools like Terraform, Ansible, and CloudFormation. Skilled in scripting languages: Python, Bash. Familiarity with compliance standards such as FedRAMP and More ❯
of experience developing scalable backend systems Strong proficiency in Python, TypeScript, and cloud platforms (especially AWS) Hands-on experience with tools such as CDK, ECS, Step Functions, Lambda, and S3 Proficiency with GitHub or similar version control systems Ability to work independently and deliver features from design to deployment Strong analytical thinking and problem-solving skills Effective communicator with More ❯
Manage the process from design to release System Infrastructure and Cloud Migration: Build and manage environments using Windows Server and IIS Plan and execute migration to AWS (EC2, RDS, S3, CloudWatch, etc.) Propose and implement security and performance improvements Development Tools & Automation: Utilize version control tools (e.g., Git) and task management systems Plan and implement process improvements and automation … Work within a DevOps environment to integrate development and operations Candidate Requirements Experience designing and building AWS architectures Hands-on management of AWS services (EC2, RDS, S3, CloudWatch, etc.) Experience developing and maintaining financial systems Proficiency in SQL for queries and data extraction Experience with IIS and Windows Server administration Participation in system migration or replacement projects AWS certifications More ❯
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Sannik
Develop, modify, and maintain Python APIs for data extraction from the housing price model. • Ensure accurate extraction of data from the model, with a focus on AWS S3 storage. • Collaborate with the development team to implement architectural changes and system tweaks. • Work with AWS services such as Lambda, EMR, and DynamoDB to optimize data management processes. • Utilize Spark for … and cloud technologies. Bachelor's degree in Computer Science or a related field. Proficiency in Python programming is essential. Strong understanding of cloud technologies, including AWS services such as S3, Lambda, and DynamoDB. Demonstrated experience with database management and backend development. Familiarity with tools like EMR and Spark for data processing. Ability to work independently and as part of More ❯
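As a hypothetical sketch of the extraction work described in this listing — function and column names are invented, and the CSV bytes stand in for the body of an S3 GetObject response — the parsing half of such an API might look like:

```python
import csv
import io

def parse_price_records(raw: bytes) -> list:
    """Parse CSV bytes (e.g. the body returned by an S3 GetObject
    call) into a list of row dicts. Column names are hypothetical."""
    reader = csv.DictReader(io.StringIO(raw.decode("utf-8")))
    return [
        {"region": row["region"], "price": float(row["price"])}
        for row in reader
    ]

# Invented sample payload, standing in for an object fetched from S3.
sample = b"region,price\nCoventry,245000\nLeeds,198500\n"
records = parse_price_records(sample)
```

In practice the bytes would come from a `boto3` S3 client, with the parsing kept separate so it can be tested without AWS access.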
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
team of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices … junior engineers and contribute to a culture of technical excellence Requirements: Proven experience in a Lead or Senior Data Engineering role Strong hands-on expertise with AWS (Glue, Lambda, S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt More ❯
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
team of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices … junior engineers and contribute to a culture of technical excellence Requirements: Proven experience in a Lead or Senior Data Engineering role Strong hands-on expertise with AWS (Glue, Lambda, S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt More ❯
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
team of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices … junior engineers and contribute to a culture of technical excellence Requirements: Proven experience in a Lead or Senior Data Engineering role Strong hands-on expertise with AWS (Glue, Lambda, S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt More ❯
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
team of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices … junior engineers and contribute to a culture of technical excellence Requirements: Proven experience in a Lead or Senior Data Engineering role Strong hands-on expertise with AWS (Glue, Lambda, S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt More ❯
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Tenth Revolution Group
team of engineers, and work closely with cross-functional teams to deliver high-impact data solutions. Key Responsibilities: Architect and maintain robust data pipelines using AWS services (Glue, Lambda, S3, Airflow) Lead the migration and optimisation of data workflows into Snowflake. Collaborate with analysts, data scientists, and product teams to deliver clean, reliable data. Define and enforce best practices … junior engineers and contribute to a culture of technical excellence Requirements: Proven experience in a Lead or Senior Data Engineering role Strong hands-on expertise with AWS (Glue, Lambda, S3) Deep experience with Snowflake as a data warehouse. Proficiency in Python or Scala for data processing Excellent communication and stakeholder management skills Preferably some experience with Terraform and dbt More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
tooling. It suits someone who thrives in greenfield environments, enjoys client engagement, and values clean, scalable, well-documented engineering. Key Responsibilities: Design and build robust data pipelines using AWS (S3, Redshift, Glue, Lambda, Step Functions, DynamoDB). Deliver ETL/ELT solutions with Matillion and related tooling. Work closely with client teams to define requirements and hand over production … ready solutions. Own infrastructure and deployment via CI/CD and IaC best practices. Contribute to technical strategy and mentor junior engineers. Requirements: Strong hands-on AWS experience – S3, Redshift, Glue essential. Proven experience building ETL/ELT pipelines in cloud environments. Proficient in working with structured/unstructured data (JSON, XML, CSV, Parquet). Skilled in working with More ❯
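As an illustrative sketch only of the structured/unstructured-data handling this listing mentions — the function and field names are invented — an ELT landing step often normalizes mixed JSON and CSV extracts into one common shape:

```python
import csv
import io
import json

def normalize(payload: str, fmt: str) -> list:
    """Normalize a raw extract (JSON or CSV text) into a common
    list-of-dicts shape. Formats and fields are illustrative only."""
    if fmt == "json":
        data = json.loads(payload)
        return data if isinstance(data, list) else [data]
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    raise ValueError(f"unsupported format: {fmt}")
```

XML and Parquet sources would follow the same pattern with their own parsers feeding the shared record shape.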