really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call…
the business and our customers. About you: Proficiency with languages/tools for data processing and analytics, such as SQL, Python/Scala, Spark, Airflow, etc. Strong understanding of data architectures, data modelling, and designing scalable and fault-tolerant data pipelines and data lakes/warehouses. You’ve worked…
methodologies such as CI/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational DBs · Experience in the financial services industry is a bonus.
A commitment to mentoring and developing junior team members, coupled with a dedication to high standards in your work. Tech Stack: dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business, and they strongly believe that ensuring diversity of background and experience will…
spanning a number of systems · At least 10 years of relevant experience · Hands-on, in-depth experience in the following: Snowflake/dbt/Airflow · Background/working experience in the following: Azure · Power BI/DAX · Traditional SQL (SQL Server, MySQL, Postgres) · Jira, Confluence (GitHub/Bitbucket) · Azure Data…
familiarity with modern software engineering practices, including 12-factor, CI/CD, and Agile methodologies. Deep understanding of Spark (PySpark), Python (Pandas), orchestration software (e.g. Airflow, Prefect), and databases, data lakes and data warehouses. Experience with cloud technologies, particularly AWS services, with a focus on using AWS Glue…
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You’ve had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team. Proven experience of working with customer data. You have a strong commitment to accuracy…
role · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Building orchestration for data pipelines using tools such as Airflow, Jenkins and GitHub Actions. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala … system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you’ll get in return: · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel…
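For illustration of the kind of pipeline orchestration the easyJet role above describes, here is a minimal sketch of an Apache Airflow DAG in Python. It assumes Airflow 2.x; the DAG id, schedule, and extract/transform callables are hypothetical placeholders, not anything taken from the posting.

```python
# Minimal illustrative Airflow DAG (assumes Airflow 2.x); all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting raw data")


def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming data")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task    # run transform only after extract succeeds
```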
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics…
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of…
data and AI models. Data Engineer Required Experience: Data engineering experience (2+ years) · Cloud platform proficiency (e.g., AWS, Azure, GCP) · Data pipeline development (e.g., Airflow, Apache Spark) · SQL proficiency, database design · Visualization tools knowledge (e.g., Tableau, Power BI, Looker) Data Engineer Application Process: This is a 1-year contract…
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com…
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practices, quality in data transformation and modelling. Essential experience with tech including: GCP, SQL and dbt. Preferably, working experience with: Kafka, Dataform, Airflow, Tableau, Power BI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer visa sponsorship; please refrain from applying if you require sponsorship at any…
range of databases. Snowflake is widely used, as are Docker and Kubernetes for containerisation. ETL and ELT tech are also used every day, primarily Airflow, Spark, Hive and a lot more. You’ll need to come from a strong academic background with some commercial experience in a data-heavy…
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar orchestration tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar reporting tools): Knowledge of Qlik Sense Cloud or similar reporting tools such…
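As a rough sketch of the Dagster-style orchestration the listing above names, here are two dependent software-defined assets. This assumes Dagster 1.x's asset API; the asset names and data are hypothetical placeholders.

```python
# Illustrative Dagster assets; names and values are placeholders.
from dagster import Definitions, asset


@asset
def raw_orders():
    # Placeholder: fetch raw order records from a source system.
    return [{"id": 1, "total": 10.0}, {"id": 2, "total": 4.5}]


@asset
def order_totals(raw_orders):
    # Depends on raw_orders by parameter name; derives a simple aggregate.
    return sum(order["total"] for order in raw_orders)


defs = Definitions(assets=[raw_orders, order_totals])
```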
Bradford, England, United Kingdom Hybrid / WFH Options
HCLTech
Pub/Sub, Dataflow, Dataproc, BigQuery, Cloud SQL) · Knowledge of containers and container orchestration · CI/CD experience · Version control (Git) · Orchestration tools (Airflow or Cloud Composer…
Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem-solving/troubleshooting skills and attention to detail. 👋 About Us: High-quality data access and provisioning shouldn’t be daunting…
experience working in cross-functional agile teams. Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure…
our customers. Skills and experience we’re looking for: Experience of creating advanced visualisations in Tableau. Experience in SQL and Python. Experience of AWS, Airflow, S3 and working with Snowflake in a large, complex organisation is advantageous. Experience of establishing processes to identify and manage issues in data or…
tools. In this role, you'll lead the development of robust, fully tested data pipelines in Python using cutting-edge platforms like Dagster/Airflow, and apply your expertise in real-time data streaming solutions using Kafka. You'll play a key role in expanding their on-prem data…
data quality checks and monitoring processes. KEY SKILLS: Proficiency in SQL and data querying for validation and testing purposes. Hands-on experience with Snowflake, Airflow, or dbt. Familiarity with data integration, ETL processes, and data governance frameworks. Solid understanding of data structures, relational databases and data modelling concepts. Experience…
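As a rough sketch of the kind of SQL-based data quality check the role above mentions, here is a not-null test run through Snowflake's Python connector. The connection details, table, and column names are hypothetical placeholders; a dbt `not_null` schema test would express the same check declaratively.

```python
# Illustrative not-null data quality check against Snowflake; connection
# details and table/column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",  # placeholder
    user="my_user",        # placeholder
    password="...",        # placeholder
    warehouse="my_wh",     # placeholder
)

with conn.cursor() as cur:
    # Count NULLs in a key column; a clean table should return 0.
    cur.execute("SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL")
    null_count = cur.fetchone()[0]
    if null_count:
        raise ValueError(f"data quality check failed: {null_count} NULL order_id rows")
```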
learning systems. Expert in Python. Knowledge of common ML libraries like Pandas, NumPy and scikit-learn. Experience with ML workflow orchestration tools (Kubeflow, MLflow, Airflow, etc.). Experience with Vertex AI. Understanding of CI/CD pipelines and DevOps processes. Ability to collaborate with data scientists and software engineers.
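By way of example, a minimal sketch of experiment tracking with MLflow, one of the workflow tools the listing above names; the experiment name, parameter, and metric values are all hypothetical.

```python
# Illustrative MLflow tracking run; names and values are placeholders.
import mlflow

mlflow.set_experiment("churn-model")  # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("n_estimators", 100)  # hypothetical hyperparameter
    mlflow.log_metric("val_auc", 0.87)     # hypothetical validation metric
```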
algorithms. Expertise in popular data science platforms such as Alteryx and Python, including libraries and frameworks like NumPy, SciPy, Pandas, NLTK, TensorFlow, PyTorch, and Airflow. Strong understanding of statistical analysis, encompassing distributions, statistical testing, regression, and other techniques. Experience handling unstructured data sets. Familiarity with software engineering principles and…