Maven, Jenkins, GitHub, etc. Experience with Amazon Web Services a strong plus: CloudFormation, EMR, S3, EC2, Athena, etc. Experience with scheduling services such as Airflow and Oozie. Experience with data ETL and data modeling. Experience building large-scale systems, with extensive knowledge of data warehousing solutions. Developing prototypes and …
tools. In this role, you'll lead the development of robust, fully tested data pipelines in Python using cutting-edge platforms like Dagster/Airflow and apply your expertise in real-time data streaming solutions using Kafka. You'll play a key role in expanding their on-prem data …
familiarity with modern software engineering practices, including 12-factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect), and databases, data lakes and data warehouses. Experience with cloud technologies, particularly AWS Cloud services, with a particular focus on using AWS Glue …
Data Engineer Azure/Snowflake/Power BI/Airflow London (Contract) Role: Our client is looking to expand their data engineering team and add in the utilisation of Snowflake, which is currently being used by a select few members of the team, with upskilling taking place alongside. They … has multiple years of experience using Snowflake as a data tool and can hit the ground running. Experience: Snowflake; financial services; cloud (ideally Azure); Airflow would be an advantage …
data quality checks and monitoring processes. KEY SKILLS Proficiency in SQL and data querying for validation and testing purposes. Hands-on experience with Snowflake, Airflow, or DBT. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases and data modelling concepts. Experience …
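The SQL-based data quality checks this listing describes can be sketched in a few lines. This is a minimal illustration using stdlib `sqlite3` rather than Snowflake; the `orders` table and its columns are invented for the example, not taken from any of the roles above.

```python
# Hedged sketch: simple SQL data quality checks of the kind the role
# describes, run against an in-memory SQLite table. Table and column
# names are illustrative assumptions only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, customer_id INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, 100), (2, None, 101), (3, -5.0, None)],
)

def quality_checks(conn):
    """Return a mapping of check name -> number of violating rows."""
    checks = {
        "null_amount": "SELECT COUNT(*) FROM orders WHERE amount IS NULL",
        "negative_amount": "SELECT COUNT(*) FROM orders WHERE amount < 0",
        "missing_customer": "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = quality_checks(conn)
```

In practice the same pattern runs as a scheduled task (e.g. in Airflow or DBT tests), alerting when any count is non-zero.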
Birmingham, England, United Kingdom Hybrid / WFH Options
Version 1
VERY OCCASIONAL TRAVEL TO CLIENT SITES AND OFFICE. Would you like the opportunity to expand your skillset across Java, Python, Spark, Hadoop, Trino & Airflow across the Banking & Financial Services industries? How about if you worked with an Innovation Partner of the Year Winner (2023 Oracle EMEA Partner Awards … trends and best practices, and share knowledge with the team. Qualifications You will have expertise within the following: Java, Python, Spark, Hadoop (Essential); Trino, Airflow (Desirable). Architecture and capabilities: designing and implementing complex solutions with a focus on scalability and security. Excellent communication and collaboration skills. Additional Information Why …
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL … . Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers, etc. …
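The event-driven streaming pattern this listing names (Kafka Streams, Flink) reduces to consuming events and maintaining running state. The sketch below shows only that core idea with the stdlib `queue` module, not a real Kafka client; the event shape and sentinel convention are assumptions for illustration.

```python
# Minimal sketch of event-driven stream processing: consume events
# from a queue and keep a running aggregate per key. Pure stdlib;
# a real deployment would read from a Kafka topic instead.
from queue import Queue

events = Queue()
for e in [{"user": "a", "clicks": 1}, {"user": "b", "clicks": 2}, {"user": "a", "clicks": 3}]:
    events.put(e)
events.put(None)  # sentinel marking end of stream (illustrative convention)

def consume(events):
    """Aggregate clicks per user until the sentinel is seen."""
    totals = {}
    while (event := events.get()) is not None:
        totals[event["user"]] = totals.get(event["user"], 0) + event["clicks"]
    return totals

totals = consume(events)
```

Kafka Streams and Flink add what this toy omits: partitioning, fault-tolerant state, and windowing over unbounded streams.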
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google …
Hybrid (2 days a week) JD: Experience of working with Streaming & Batch technology stack – Confluent Kafka, MongoDB, StreamSets, IBM CDC, Hive, Hadoop, API, Informatica, Airflow, and other similar technologies. SME-level skills and experience of designing/architecting test automation solutions; ability to creatively problem-solve is critical for …
role · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Building orchestration for data pipelines using tools such as Airflow, Jenkins and GitHub Actions. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala … system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you'll get in return · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel …
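The orchestration tools recurring across these listings (Airflow, Control-M, Spring DataFlow) all implement the same core idea: tasks declared as a dependency graph and executed in topological order. A minimal stdlib sketch of that idea, with invented task names and no Airflow dependency:

```python
# Sketch of DAG-style task orchestration, the core concept behind
# schedulers like Airflow: declare tasks and their upstream
# dependencies, then execute in an order that respects them.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on (names are illustrative only)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_pipeline(dag):
    """Run each task after all of its dependencies have completed."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")
    return order

order = run_pipeline(dag)
```

A production scheduler adds retries, backfills, and parallel execution of independent branches on top of exactly this ordering step.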
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Apache Airflow would be a bonus. Role overview: If you're looking to work with a …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Apache Airflow would be a bonus. Role overview: If you're looking to work with a …
learning systems. Expert in Python. Knowledge of common ML libraries like Pandas, NumPy and Scikit-Learn. Experience with ML workflow orchestration tools (Kubeflow, MLflow, Airflow, etc.). Experience with Vertex AI. Understanding of CI/CD pipelines and DevOps processes. Ability to collaborate with data scientists and software engineers. …
and maintain scalable data architectures, including pipelines and cloud-based data warehouses. Tech: Python (NumPy, Pandas), SQL, ETL, Cloud (AWS, Azure or GCP), Snowflake, Airflow, BigQuery, Power BI/Tableau. Industry: Fintech, Maritime trading. Immersum are supporting the growth of a specialist consultancy who solely specialise in the Maritime trading … developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or Power BI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
projects and new innovations to support company growth and profitability. Our Tech Stack: Python, Scala, Kotlin, Spark, Google Pub/Sub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub etc. Optimising data storage and retrieval … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages: Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
pandas, numpy, pyspark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience in Docker. Good experience in Linux. Good experience in Airflow. Good knowledge of cloud architecture. Good experience in Terraform. Expert experience with database systems (Snowflake, SQL, Postgres, etc.). Experience of micro-service development and …