Manchester, Greater Manchester, United Kingdom Hybrid / WFH Options
AutoTrader UK
robust and scalable web hosting and data platforms. Our platform is a layer on top of core Open Source technologies such as Kubernetes, Istio, Airflow, and dbt, running in Public Cloud. It is the glue that allows our teams to deploy into production environments 100s of times per day …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google …
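The snippet above asks for basic Python scripting skills with libraries like os and csv. A minimal sketch of that kind of task, using only the standard library (the file name and column names here are hypothetical, not from any of the ads):

```python
import csv
import os


def load_rows(path):
    """Read a CSV file into a list of dicts, returning [] if the file is missing."""
    if not os.path.exists(path):
        return []
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))


# Hypothetical example: write a small file, then read it back and aggregate.
with open("sales.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["region", "amount"])
    writer.writerow(["north", "120"])
    writer.writerow(["south", "80"])

rows = load_rows("sales.csv")
total = sum(int(r["amount"]) for r in rows)  # → 200
```

In practice the same aggregation would usually be pushed down into SQL or pandas; the stdlib version is just the baseline scripting the ad describes.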
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call …
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java, or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation …
Experience using cloud technologies such as EMR, Lambda, EC2, and data pipelines. Experience leading data warehousing and analytics projects, including using technologies such as Airflow, Jenkins, Snowflake, and Kinesis. Experience with Agile, DevOps, and CI/CD frameworks in cloud-based environments. Exposure to at least one dashboarding tool like Tableau …
data and AI models. Data Engineer Required Experience Data engineering experience (2+ years) Cloud platform proficiency (e.g., AWS, Azure, GCP) Data pipeline development (e.g., Airflow, Apache Spark) SQL proficiency, database design Visualization tools knowledge (e.g., Tableau, Power BI, Looker) Data Engineer Application Process This is a 1-year contract …
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com …
out on the front end! YOUR EXPERIENCE Python Cloud experience - AWS/GCP/Azure CI/CD Data modeling experience will be useful Airflow & dbt experience will be useful THE BENEFITS An education budget is available to learn and develop with the company Matched pension Travel budget in …
Manchester Area, United Kingdom Hybrid / WFH Options
Your Next Hire
classes) Strong SQL experience (required to optimise our database) GCP is preferable (AWS or Azure is fine) Able to modify classes & DAGs within Airflow An ability to self-deploy & work autonomously Charles has worked with this team for a few years; apply to this advert with your CV …
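Several of the ads on this page (this one included) mention modifying DAGs in Airflow. For context, an Airflow DAG is declared in an ordinary Python module that the scheduler picks up; a minimal sketch, assuming Airflow 2.4+ is installed (the DAG id and task callables below are hypothetical, not this employer's code):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical extract step: pull rows from a source system.
    print("extracting")


def load():
    # Hypothetical load step: write rows to the warehouse.
    print("loading")


# A minimal two-task DAG run daily; task ordering is declared with >>.
with DAG(
    dag_id="example_etl",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```

"Modifying classes & DAGs" in practice usually means editing files like this one: adding tasks, changing the schedule, or adjusting operator parameters.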
the business and our customers. About you: Proficiency with languages/tools for data processing and analytics, such as SQL, Python/Scala, Spark, Airflow, etc. Strong understanding of data architectures, data modelling, and designing scalable and fault-tolerant data pipelines and data lakes/warehouses. You’ve worked …
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and dbt Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the …
Preferred Qualifications * Designing and implementing real-time pipelines. * Designing and implementing data pipelines for CV/ML systems. * Experience with workflow management engines (e.g. Airflow, Luigi, Prefect, Dagster, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). * Experience with data quality and validation. * Experience querying …
A commitment to mentoring and developing junior team members, coupled with a dedication to high standards in your work. Tech Stack: dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business and they strongly believe that ensuring diversity of background and experience will …
the paradigm of statistical significance testing Desirable experience: Familiarity with energy data, smart grids, demand response, or related fields is a plus Experience with Airflow and/or Airbyte 5+ years of experience as a data scientist, leading a small team Expertise in Python (including asyncio) as a software …
Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't be daunting …
experience working in cross-functional agile teams. Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure …
expertise in tools spanning: data warehousing, ETL, internal visualisation and analytics. Good examples are Snowflake, GCP, Azure Analytics, Sagemaker, Databricks, Tableau, Power BI, Looker, Quicksight, Airflow, astronomer.io, Alteryx, Collibra. • Hands-on experience and/or a detailed and deep understanding of the workflow and approaches of: o BI analyst for visualisation …
role · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Building orchestration for data pipelines using tools such as Airflow, Jenkins and GitHub Actions. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala … system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you’ll get in return · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics …
Python or Bash for automation and infrastructure management tasks. Basic understanding of CI/CD pipelines and version control systems like Git. Exposure to Apache Airflow, Informatica, or similar data integration tools.
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of …
expertise with either AWS or GCP Strong Python experience Exposure to medallion architecture 👍 Bonus points for: Supporting experience with tech like Athena, RedShift, BigQuery, Airflow, Kinesis, Kafka (or similar) The ability to contribute to technical decisioning Experience working with a high volume of data pipelines …