City of London, London, United Kingdom Hybrid / WFH Options
IO Associates
2 x Contract Data Engineers - Snowflake/AWS/Python/Airflow/Iceberg Location: London (Hybrid - 3 days per week onsite) Duration: 6 months Day Rate: £550 - £600 (Inside IR35) A highly reputable consultancy is seeking 2 x Contract Data Engineers to join their data team on a 6-month engagement. You will play a key role in building … and reporting capabilities. Key Skills & Experience: Strong experience with Snowflake data warehousing Solid AWS cloud engineering experience Proficient in Python for data engineering workflows Skilled in building and maintaining Airflow DAGs Familiarity with Apache Iceberg for table format and data lake optimisation If this could be of interest, please get in touch with Alex Lang at iO Associates More ❯
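This role (like several below) centres on building and maintaining Airflow DAGs alongside Snowflake and AWS. For illustration only, here is a minimal sketch of what such a DAG skeleton can look like in Python, assuming a recent Airflow 2.x release; the DAG id, schedule, and task bodies are placeholders rather than anything from the advert.

```python
# Minimal, illustrative Airflow DAG: a two-step extract/load skeleton.
# Names and the schedule are placeholders, not taken from the listing.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step; in practice this might pull files from S3.
    print("extracting source data")


def load(**context):
    # Placeholder load step; in practice this might COPY data into Snowflake.
    print("loading into the warehouse")


with DAG(
    dag_id="example_snowflake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run extract before load.
    extract_task >> load_task
```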
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions More ❯
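This role pairs hands-on Snowflake experience with SQL scripting and Python-adjacent tooling (dbt, Airflow, Terraform). As a minimal, non-authoritative sketch, the Python below executes a SQL statement against Snowflake using the snowflake-connector-python package; the account, credentials, and warehouse/database names are placeholders, not details from the advert.

```python
# Illustrative sketch: running a SQL statement against Snowflake from Python.
# All connection details below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # A trivial query to confirm connectivity; real workloads would run
    # transformations or COPY INTO statements here.
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone())
finally:
    conn.close()
```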
searchability within downstream enterprise tools. Your work will be pivotal in implementing cutting-edge technology, including AI models, to improve data fidelity and accelerate data engineering tasks. Familiarity with Apache Airflow and Apache Hop will be beneficial as you build new and repeatable workflows to support the team's objectives. Your success in this position will be … have what it takes? Active TS/SCI with Polygraph required. Bachelor's degree in Computer Science, Software Engineering, or related field. Required Skills: Python Structured Query Language (SQL) Apache Airflow, Apache Hop Experience normalizing raw, unstructured, and structured data Ability to work independently and collaboratively Experience building, optimizing, and implementing ETL processes and data pipelines Experience More ❯
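Among the required skills above is experience normalizing raw, unstructured, and structured data with Python and SQL. As a small, hedged illustration of one common approach, the sketch below flattens nested JSON records into a tabular form with pandas; the record shape and field names are invented for the example.

```python
# Illustrative sketch: flattening nested JSON records into a tabular form.
# The records and column names are made up for the example.
import pandas as pd

raw_records = [
    {"id": 1, "user": {"name": "alice", "country": "UK"}, "events": 3},
    {"id": 2, "user": {"name": "bob"}, "events": None},
]

# json_normalize flattens nested dictionaries into dotted column names.
df = pd.json_normalize(raw_records)

# Basic normalisation: consistent column names, types, and defaults.
df = df.rename(columns={"user.name": "user_name", "user.country": "country"})
df["events"] = df["events"].fillna(0).astype(int)
df["country"] = df["country"].fillna("unknown")

print(df)
```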
and DataOps as well as System engineers to support both data and application integrations using bespoke tools written in Python/Java, as well as tools such as Meltano, Airflow, MuleSoft/SnapLogic, Apache NiFi, and Kafka, ensuring a robust, well-modelled, and scalable data analytics infrastructure running on MySQL and Postgres style databases primarily. Requirements: Advanced SQL … compliance) Proficiency in ELT/ETL processes Strong experience in data ingestion, transformation & orchestration technology (ETL tools such as Informatica, DataStage, SSIS, etc.) or open-source Meltano, Airbyte, and Airflow Proven experience with DBT (data build tool) Proficiency with business intelligence tools (Power BI, Tableau, SAP BI, or similar). Integration & Programming Hands-on experience with API development and … integration (REST/SOAP) Proficiency in at least 1 object/procedural/functional language (e.g. Java, PHP, Python) Familiarity with EAI tools such as MuleSoft/SnapLogic or Apache NiFi Experience with infrastructure-as-code tools such as Terraform and Ansible Experience with version control (e.g. Git, SVN) and CI/CD workflows for deployment Experience scraping external More ❯
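The requirements above call out API integration (REST/SOAP) alongside Python-based ingestion pipelines. The following is a minimal, illustrative sketch of pulling paginated records from a REST endpoint with the requests library ahead of a load step; the endpoint URL, pagination scheme, and parameters are hypothetical.

```python
# Illustrative REST ingestion sketch: fetch paginated records for staging.
# The endpoint and its pagination behaviour are hypothetical.
import requests

BASE_URL = "https://api.example.com/v1/orders"  # placeholder endpoint


def fetch_all(page_size: int = 100) -> list[dict]:
    records, page = [], 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json()
        if not batch:
            # Empty page signals the end of the dataset in this sketch.
            break
        records.extend(batch)
        page += 1
    return records


if __name__ == "__main__":
    rows = fetch_all()
    print(f"fetched {len(rows)} records for staging")
```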
San Antonio, Texas, United States Hybrid / WFH Options
IAMUS
Tools: Proficiency with Maven and GitLab. Data Formats: Familiarity with JSON, XML, SQL, and compressed file formats. Configuration Files: Experience using YAML files for data model and schema configuration. Apache NiFi: Significant experience with NiFi administration and building/troubleshooting data flows. AWS S3: bucket administration. IDE: VSCode, IntelliJ/PyCharm, or other suitable. Technical Expertise: ETL creation and processing … experience in cyber/network security operations. Familiarity with Agile environments. Good communication skills. Developed documentation and training in areas of expertise. Amazon S3, SQS/SNS Admin experience Apache Airflow workloads via UI or CLI a plus Experience with Mage AI a plus Kubernetes, Docker
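Given the emphasis above on AWS S3 bucket administration alongside NiFi data flows, here is a small, non-authoritative boto3 sketch showing two routine administration tasks: listing objects under a prefix and applying a lifecycle expiration rule. The bucket name, prefix, and retention period are placeholders.

```python
# Illustrative boto3 sketch of basic S3 bucket administration.
# Bucket name and prefix are placeholders.
import boto3

s3 = boto3.client("s3")

BUCKET = "example-dataflow-bucket"  # placeholder

# List objects under a prefix (e.g. staged NiFi output).
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix="staging/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Apply a simple lifecycle rule to expire staged files after 30 days.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-staging",
                "Filter": {"Prefix": "staging/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            }
        ]
    },
)
```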
TransferGo. Our products are recognised by industry leaders like Gartner's Magic Quadrant, Forrester Wave and Frost Radar. Our tech stack: Superset and similar data visualisation tools. ETL tools: Airflow, DBT, Airbyte, Flink, etc. Data warehousing and storage solutions: ClickHouse, Trino, S3. AWS Cloud, Kubernetes, Helm. Relevant programming languages for data engineering tasks: SQL, Python, Java, etc. What you … methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data More ❯
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgresDB, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing More ❯
containerization and CI/CD tools (e.g., Docker, GitHub Actions). Knowledge of networking and cloud infrastructure (e.g., AWS, Azure). Experience with modern data processing frameworks (e.g., dbt, Apache Airflow, Spark, or similar). Requirements A strong focus on system observability and data quality. Emphasis on rapid scalability of solutions (consider market ramp up when entering a More ❯
Computer Science, Engineering, or a related field, or equivalent industry experience. Preferred Qualifications Experience or interest in mentoring junior engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other cloud platforms. Knowledge More ❯
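The preferred qualifications above mention data validation and anomaly detection with tools like Pandas or Polars. As a rough illustration, the sketch below runs a few simple checks (uniqueness, nulls, and an interquartile-range outlier test) with pandas; the columns, data, and thresholds are made up for the example.

```python
# Illustrative data-validation sketch: simple schema and outlier checks with pandas.
# Column names, values, and thresholds are invented for the example.
import pandas as pd

df = pd.DataFrame(
    {"order_id": [1, 2, 2, 4], "amount": [10.0, 12.5, 11.0, 950.0]}
)

issues = []

# Uniqueness check on the key column.
if df["order_id"].duplicated().any():
    issues.append("duplicate order_id values found")

# Null check on the measure column.
if df["amount"].isna().any():
    issues.append("missing amounts found")

# Crude anomaly check: flag values far outside the interquartile range.
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)]
if not outliers.empty:
    issues.append(f"{len(outliers)} amount outliers found")

print(issues or "all checks passed")
```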
with interface/API data modeling. Knowledge of CI/CD tools like GitHub Actions or similar. AWS certifications such as AWS Certified Data Engineer. Knowledge of Snowflake, SQL, Apache Airflow, and DBT. Familiarity with Atlan for data cataloging and metadata management. Understanding of Iceberg tables. Who we are: We're a global business empowering local teams with More ❯
Experience with rapid prototype creation and deployment is a must. Comfortable working under tight deadlines as necessary. Possible Tech Stack: Programming Languages: Python, Java, or Go. Data Engineering Tools: Apache Kafka, Airflow (for orchestration), Spark (if needed for larger datasets). OpenSearch/Elasticsearch: Indexing, querying, and optimizing. Visualization Tools: Kibana, Grafana (for more advanced visualizations), React.js. Cloud More ❯
Chantilly, Virginia, United States Hybrid / WFH Options
Noblis
engineering, mathematics, and physics. 5+ years of professional work experience. Programming knowledge in some of the following languages: Python, SQL, Java. Strong experience with workflow orchestration tools such as Apache Airflow or Prefect. Database experience with some of the following products: Elasticsearch, PostgreSQL, MySQL, Oracle, MongoDB Experience working with Docker, Kubernetes Proficient with git and pull request workflows. More ❯
infrastructure. You will have good software development experience with Python coupled with strong SQL skills. In addition, you will have a strong desire to work with Docker, Kubernetes, Airflow and the AWS data technologies such as Athena, Redshift, EMR and various other tools in the AWS ecosystem. You would be joining a team of 25+ engineers across mobile … skills Familiarity with continuous integration, unit testing tools and related practices Understanding of Agile Scrum software development lifecycle What you'll be doing: Implementing and maintaining ETL pipelines using Airflow & AWS technologies Contributing to data-driven tools owned by the data engineering team, including content personalisation Responsibility for the ingestion framework and processes Helping monitor and look after our data More ❯
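This listing describes implementing ETL pipelines with Airflow and AWS services such as Athena, Redshift and EMR. As a rough, non-authoritative sketch of the kind of step an Airflow task in such a pipeline might wrap, the Python below starts an Athena query via boto3 and polls for completion; the database, table, and results bucket are invented for the example.

```python
# Illustrative sketch: run an Athena query from Python with boto3 and wait for it.
# Database, table, and output location are placeholders.
import time

import boto3

athena = boto3.client("athena")

query = (
    "SELECT event_date, COUNT(*) AS events "
    "FROM analytics.page_views GROUP BY event_date"
)

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = execution["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

print("query finished with state:", state)
```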
Maintain and develop data warehouses Provide guidance and mentorship to junior team members To be successful in this role you will have: Extensive experience with Snowflake Experience with DBT, Airflow or Python Cloud Data Engineering experience with AWS/Azure/GCP This is a hybrid role based from the company's London office with some of the benefits including More ❯
Science, Software Engineering, or a related field. Extensive experience with a broad range of data technologies, platforms, and languages, such as SQL, Python, Java, and data orchestration tools like Apache Airflow. Demonstrated expertise in modern data architecture principles, including data lakes, data warehousing, ETL processes, and real-time data processing. Proficiency in Agile methodologies tailored for data-centric projects More ❯
Out in Science, Technology, Engineering, and Mathematics
modern data libraries (e.g. Pandas, PySpark, Dask). Strong SQL skills and experience with cloud-native data tools (AWS, GCP, or Azure). Hands-on experience with tools like Airflow, Spark, Kafka, or Snowflake. Experience working with unstructured data, NLP pipelines, and time-series databases. Familiarity with deploying AI/ML models and supporting MLOps workflows. Interest in or More ❯
Excellent problem-solving and analytical skills Strong communication and collaboration abilities to work effectively with cross-functional teams. Nice to Haves: Experience with data pipeline orchestration tools such as Apache Airflow. Knowledge of data streaming technologies like Kafka. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data processing services. Exposure to data visualization tools and techniques More ❯
plenty of incredible developers at Octopus Energy who are willing to teach if you're willing to learn! Required experience: Python Git Nice to have: SQL dbt GitHub CircleCI Airflow Kubernetes Terraform A cloud warehouse provider e.g. Databricks, GCP, Snowflake AWS We aren't necessarily looking for someone who is "10-out-of-10" in all these areas; but More ❯
to talk to you if: You've led technical delivery of data engineering projects in a consultancy or client-facing environment You're experienced with Python, SQL, .NET, dbt, Airflow and cloud-native data tools (AWS, GCP or Azure) You have strong knowledge of data architecture patterns - including Lakehouse and modern warehouse design (e.g. Snowflake, BigQuery, Databricks) You know More ❯
data modeling best practices and version controlling of software and configuration among development, integration, and production environments. Requirements: Experience with tools for data routing and transformation such as NiFi, Airflow, or dbt. Strong in Java, Python, or SQL. ETL/ELT data pipelines development supporting data-driven applications. Proficient in scripting such as Linux shell, Python, Perl, JavaScript, or More ❯
Out in Science, Technology, Engineering, and Mathematics
learn to name a few of the open-source libraries we use extensively. We implement the systems that require the highest data throughput in Java and C++. We use Airflow for workflow management, Kafka for data pipelines, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker and More ❯
Experience with Python or other scripting languages Good working knowledge of the AWS data management stack (RDS, S3, Redshift, Athena, Glue, QuickSight) or Google data management stack (Cloud Storage, Airflow, BigQuery, Dataplex, Looker) About Our Process We can be flexible with the structure of our interview process if someone's circumstances or timescales require it but our general More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
architectures, data pipelines, and ETL processes Hands-on experience with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, DBT) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to More ❯