such as Fivetran or Stitch. Experience using Business Intelligence tools such as PowerBI, Looker or Tableau. Experience with data pipeline tools such as dbt, Airflow or Luigi is a plus! Experience using cloud environments, e.g. Azure or AWS. Understanding of the Agile delivery method. Working Conditions: · Permanent, London Chiswick more »
Experience in deploying solutions on a cloud platform - AWS (preferred), Azure, GCP. Expertise in container technologies - Docker, Kubernetes. Proficiency with workflow orchestration tools like Apache Airflow to design, schedule and monitor data processing pipelines. Experience with any visualization tool - Tableau (preferred), PowerBI, Spotfire. YOU'LL WORK WITH Our more »
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google more »
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. 🛠Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt 🌳Environment: Agile ✍️Process: 3 stages No CV? No problem. Email me at athomas@trg-uk.com, and let’s arrange a call more »
existing systems and ingestion pipelines. Requirements: Proven experience working with Python, Java or C# Experience working with ETL/ELT technologies such as Airflow, Argo, Dagster, Spark, Hive Strong technical expertise, especially in data processing and exploration, with a willingness to learn new technologies. A passion for automation more »
Experience using cloud technologies such as EMR, Lambda, EC2, and data pipelines. Experience leading data warehousing and analytics projects, including using technologies such as Airflow, Jenkins, Snowflake, and Kinesis. Experience with Agile, DevOps, and CI/CD frameworks in cloud-based environments. Exposure to at least one dashboarding tool like Tableau more »
data and AI models. Data Engineer Required Experience Data engineering experience (2+ years) Cloud platform proficiency (e.g., AWS, Azure, GCP) Data pipeline development (e.g., Airflow, Apache Spark) SQL proficiency, database design Visualization tools knowledge (e.g., Tableau, PowerBI, Looker) Data Engineer Application Process This is a 1-year contract more »
really value their employees. As a testament to this, you’ll also receive an unrivalled benefits package. Tech: Snowflake, AWS (or Azure/GCP), Airflow, dbt TC: £85,000 + bonus + up to 22% pension Process: 2 stages No CV? No problem. Email me at athomas@trg-uk.com more »
Manchester Area, United Kingdom Hybrid / WFH Options
Your Next Hire
classes) Strong SQL experience (Required to optimise our database) GCP is preferable (AWS or Azure is fine) Able to modify classes & DAGs within an Airflow environment An ability to self-deploy & work autonomously Charles has worked with this team for a few years, apply to this advert with your CV more »
the business and our customers. About you; Proficiency with language/tools for data processing and analytics, such as SQL, Python/Scala, Spark, Airflow, etc. Strong understanding of data architectures, data modelling, and designing scalable and fault-tolerant data pipelines and data lakes/warehouses. You’ve worked more »
or warehouse. Data Pipeline Development: Design and construct data pipelines to automate data flow, involving ETL processes as needed. Modern tech stack - Python, AWS, Airflow and dbt Must haves: A team player, happy to work with several teams; this is key as you will be reporting directly to the more »
Preferred Qualifications • Designing and implementing real-time pipelines. • Designing and implementing data pipelines for CV/ML systems. • Experience with workflow management engines (e.g. Airflow, Luigi, Prefect, Dagster, digdag.io, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). • Experience with data quality and validation. • Experience more »
Preferred Qualifications * Designing and implementing real-time pipelines. * Designing and implementing data pipelines for CV/ML systems. * Experience with workflow management engines (e.g. Airflow, Luigi, Prefect, Dagster, Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). * Experience with data quality and validation. * Experience querying more »
A commitment to mentoring and developing junior team members, coupled with a dedication to high standards in your work. Tech Stack : dbt, SQL, Snowflake, Airflow, AWS, Tableau. DE&I is also at the heart of the business and they strongly believe that ensuring diversity of background and experience will more »
the paradigm of statistical significance testing Desirable experience: Familiarity with energy data, smart grids, demand response, or related fields is a plus Experience with Airflow and/or Airbyte 5+ years of experience as a data scientist, leading a small team Expertise in Python (including asyncio) as a software more »
Familiarity with Kotlin or willingness to learn. Industrial experience with AWS/GCP/Azure. Knowledge of common data products such as Hadoop, Spark, Airflow, PostgreSQL, S3, etc. Problem solving/troubleshooting skills and attention to detail. 👋 About Us High-quality data access and provisioning shouldn't be daunting more »
experience working in cross-functional agile teams. Technologies such as Docker and orchestration tools like Kubernetes for containerized deployments. Workflow management tools such as Airflow for orchestrating complex data pipelines and ETL processes. Certifications in Azure cloud services and data engineering technologies, demonstrating expertise and proficiency in the Azure more »
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team Proven experience of working with customer data You've a strong commitment to accuracy more »
role · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Building orchestration for data pipelines using tools such as Airflow, Jenkins and GitHub actions. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala … system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. ApacheAirflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you’ll get in return ·Competitive base salary ·Up to 20% bonus ·25 days holiday ·BAYE, SAYE & Performance share schemes ·7% pension ·Life Insurance ·Work Away Scheme ·Flexible benefits package ·Excellent staff travel more »
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics more »
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of more »
expertise with either AWS or GCP Strong Python experience Exposure to medallion architecture 👍 Bonus points for: Supporting experience with tech like Athena, RedShift, BigQuery, Airflow, Kinesis, Kafka (or similar) The ability to contribute to technical decision-making Experience working with a high volume of data pipelines more »
to onboard strategies and desks efficiently Working with time-series data sets, and building the data infra for the desk Stack: Python, SQL, MongoDB, Airflow The firm can offer market leading salaries, collaboration with London's best technologists, and also a hybrid work model. If this sounds of interest more »