we will be happy to support you. KEYWORDS: Python/Google Cloud Platform/GCP/SQL/PostgreSQL/Pandas/SQLAlchemy/Apache Airflow/DataBricks/Snowflake/Luigi/BigTable/Redis/CouchDB/RethinkDB/Elasticsearch/Insurance/Asset Management/Reinsurance/Big Data …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
in SQL, NoSQL, Blob, Delta Lake, and other enterprise scale data stores. Data Orchestration - Enterprise scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps, DBT, SnapLogic, Spark or similar tools. Software Tooling - Git/GitHub, CI/CD, deployment tools like Octopus, Terraform infrastructure …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
creating ETL pipelines in Python * Exposure to analytical data warehouses such as Snowflake, Redshift or BigQuery (Redshift preferred) * Experience with data orchestrators such as Airflow, AWS Step Functions, AWS Batch * Knowledge of Agile development methodologies * Knowledge of automated delivery processes * Experience designing and building autonomous data pipelines BENEFITS Competitive …
Manchester Area, United Kingdom Hybrid / WFH Options
Your Next Hire
classes) Strong SQL experience (required to optimise our database) GCP is preferable (AWS or Azure is fine) Able to modify classes & DAGs within Airflow An ability to self-deploy & work autonomously Charles has worked with this team for a few years, apply to this advert with your CV …
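For context on the DAG-modification skill mentioned above: an Airflow DAG is a set of tasks plus dependency edges, and tasks run only once their upstream tasks finish. A minimal sketch of that execution model in plain Python (no Airflow dependency; the task names and pipeline shape here are hypothetical examples, not from the advert):

```python
# Toy illustration of the DAG concept behind Airflow: each task declares
# its upstream dependencies, and the runner executes them in order,
# passing each task's result to its downstream tasks.

def extract():
    return [1, 2, 3]

def transform(rows):
    return [r * 10 for r in rows]

def load(rows):
    return f"loaded {len(rows)} rows"

# DAG as an adjacency map: task name -> list of upstream task names.
DAG = {"extract": [], "transform": ["extract"], "load": ["transform"]}
TASKS = {"extract": extract, "transform": transform, "load": load}

def run(dag, tasks):
    done, order = {}, []

    def visit(name):
        # Run upstream tasks first (depth-first), memoising results.
        if name in done:
            return done[name]
        upstream = [visit(dep) for dep in dag[name]]
        done[name] = tasks[name](*upstream)
        order.append(name)
        return done[name]

    for name in dag:
        visit(name)
    return order, done

order, results = run(DAG, TASKS)
print(order)             # ['extract', 'transform', 'load']
print(results["load"])   # loaded 3 rows
```

Real Airflow expresses the same structure declaratively (operators plus `>>` dependency arrows) and adds scheduling, retries and backfills on top; the ordering logic above is the core idea a candidate would be editing when they "modify DAGs".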
Southampton, Hampshire, South East, United Kingdom Hybrid / WFH Options
Datatech Analytics
record of taking data science products from conception through to deployment in production. You've had experience deploying data science workflows with tools like Airflow or Databricks. Proven experience of leading a data science team. Proven experience of working with customer data. You've a strong commitment to accuracy …
role · Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. · Building orchestration for data pipelines using tools such as Airflow, Jenkins and GitHub Actions. · Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala … system. · Significant experience with Python, and experience with modern software development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise … scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience with Linux and containerisation What you'll get in return · Competitive base salary · Up to 20% bonus · 25 days holiday · BAYE, SAYE & Performance share schemes · 7% pension · Life Insurance · Work Away Scheme · Flexible benefits package · Excellent staff travel …
Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills with the ability to explain complex data concepts to a non-technical audience. Knowledge of …
Lead Data Engineer Location: York - 2 days per week onsite/3 days WFH Start: ASAP Duration: 6-12 months Requirements: Snowflake, DBT/Airflow, Azure, PowerBI/DAX, traditional SQL, JIRA, Confluence (GitHub/BitBucket), Azure Data Factory …
Manchester Area, United Kingdom Hybrid / WFH Options
Maxwell Bond®
ensuring best practices, quality in data transformation and modelling. Essential experience with tech including GCP, SQL and DBT. Preferably working experience with: Kafka, Dataform, Airflow, Tableau, PowerBI, Redshift, Snowflake, Terraform and BigQuery. This position does not offer visa sponsorship; please refrain from applying if you require sponsorship at any …
Abingdon-On-Thames, England, United Kingdom Hybrid / WFH Options
Mirus Talent
mandatory, familiarity with the following technologies and tools would be advantageous: Dagster (or similar orchestration tools): Experience with Dagster or other orchestration tools like Airflow for managing complex data workflows and pipelines. Qlik Sense Cloud (or similar reporting tools): Knowledge of Qlik Sense Cloud or similar reporting tools such …
Bradford, England, United Kingdom Hybrid / WFH Options
HCLTech
Pub/Sub, Dataflow, Dataproc, BigQuery, Cloud SQL) Knowledge in containers and container orchestration CI/CD experience Version control (Git) Orchestration tools (Airflow or Cloud Composer) …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
diverse environments, leveraging Azure and other modern technologies. Proven ability to orchestrate complex data workflows and manage Kubernetes clusters on AKS, utilizing tools like Airflow, Kubeflow, Argo, and Dagster. Familiarity with data ingestion tools such as Airbyte and Fivetran, accommodating a wide array of data sources. Mastery of large …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
TMS (Tealium IQ, GTM and Adobe Dynamic Tag Manager) changes. Integrate data sources via web and REST APIs. Data piping and modelling using SQL, DBT, Airflow, ETL, Data Warehousing, Redshift and Python. Transfer knowledge of the business processes and requirements to the development teams. Collaborate with Product, Marketing and Development …
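The "piping and modelling using SQL" step mentioned in the listing above is, at its core, turning raw records into a modelled summary table. A minimal sketch using Python's built-in sqlite3 as a stand-in for Redshift (all table and column names here are hypothetical examples, not from the advert):

```python
import sqlite3

# Extract: in a real pipeline this source table would be loaded by an
# Airflow task or a Fivetran-style connector; here we seed it in-memory.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, revenue REAL);
    INSERT INTO raw_events VALUES (1, 10.0), (1, 5.0), (2, 7.5);
""")

# Transform/model: aggregate raw events into a per-user table -- the kind
# of step a dbt model (a SELECT materialised as a table) would own.
conn.execute("""
    CREATE TABLE user_revenue AS
    SELECT user_id, SUM(revenue) AS total_revenue
    FROM raw_events
    GROUP BY user_id
""")

# Load/serve: downstream reporting reads the modelled table.
rows = conn.execute(
    "SELECT user_id, total_revenue FROM user_revenue ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 15.0), (2, 7.5)]
```

In production the same SELECT would live in a dbt model file and be scheduled by Airflow, with the warehouse (Redshift here) doing the heavy lifting.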
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
for advanced content analysis and indexing & developing RAG services. Experience in managing data workflows and Kubernetes clusters on AKS, utilizing tools such as Airflow, Kubeflow, Argo, and Dagster. Familiarity with scripting languages and tools such as Bash, PowerShell, Azure CLI, Terraform, and Helm Charts. Additional Information Location: This …
NLP approaches like Word2Vec or BERT, including identifying the right KPIs and objective functions. Experience working with big data systems (Spark, EMR, Kafka, S3, Airflow) and programming languages (Java, Scala, or Python). Experience building in-production Machine Learning systems. Good understanding of system architecture, including experience with big …
Staines-Upon-Thames, England, United Kingdom Hybrid / WFH Options
IFS
Langchain, Semantic Kernel, and tools like MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency in pipeline orchestration with tools like Airflow, Kubeflow, and Argo. Exceptional communication skills, with the ability to articulate complex statistical concepts clearly. Personal Competencies: A results-oriented professional with a …
on with projects but no requirements, previous experience in this is essential SKILLS AND EXPERIENCE NEEDED: Experience in Redshift database (AWS) Experience with DBT, Airflow and Fivetran Working collaboratively with multiple teams across the business Management/mentoring experience of a team INTERVIEW PROCESS: 1st Stage - Initial Chat 2nd …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Airflow would also be a bonus. Role overview: If you're looking to work with a …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
CD, and model monitoring. Proficiency in Python and relevant data manipulation and analysis libraries (e.g., pandas, NumPy). Experience with distributed computing frameworks like Apache Spark is a plus; Airflow would also be a bonus. Role overview: If you're looking to work with a …
Role: Graduate Data Engineer Type: 12 months fixed-term Location: Peterborough Ready to utilise your skills to process and extract value from large datasets? Are you passionate about performing root cause analysis on various data? We have an exciting role …
using Docker, Kubernetes and Cloud services. Experience with Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques as well … and familiarity with tools to manage infrastructure as code, like Terraform and package managers like Helm Charts. Proficiency with pipeline orchestration tools, such as Airflow, Kubeflow, and Argo Workflows. Outstanding communication skills, ability to convey complex technical concepts to non-technical stakeholders and collaborate with cross-functional teams. A …