algorithms (supervised, unsupervised, deep learning), including model evaluation, explainability, and selection for business-critical use cases. Strong hands-on experience with cloud infrastructure (AWS), containerization (Docker), and orchestration (Jenkins, Airflow). Proven capability in MLOps, including CI/CD pipelines, model monitoring, versioning, and automated retraining. Experience deploying and serving models through APIs (e.g., Flask, FastAPI) in both real-time …
data platforms. Implement data transformation logic using PySpark, Python, or SQL within AWS Glue or Lambda. Monitor, schedule, and orchestrate ETL workflows using AWS Step Functions, Glue Workflows, or Apache Airflow on Amazon MWAA. Ensure data quality, consistency, and lineage using the AWS Glue Data Catalog and AWS Lake Formation. Optimize ETL performance and cost-efficiency through partitioning, parallelism …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Ubique Systems
modelling, data warehousing concepts, and data integration best practices. Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing. Familiarity with data orchestration tools such as Apache Airflow. Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment. Experience with code versioning tools (e.g., Git). Meticulous attention to detail …
ability to work independently and show initiative. Strong data manipulation, pipeline, and integration software programming and/or scripting skills in areas such as Python, ETL, APIs, SQL, SOQL, Apache Airflow, Bash, Logstash, or equivalent. Significant experience successfully processing major data format types, structures, and access methods, such as CSV, Excel, JSON, RDBMS, hashes, arrays, structured and unstructured …
Lead Data Engineer - Python, SQL, GCP, Kubernetes, DBT, Airflow and BigQuery. OB have partnered with a leading scale-up business in the AdTech space, where data is at the forefront of everything they do, and they are currently hiring a Lead Data Engineer to join their team and work on the development of their data platform …
and cloud projects for the government. We're looking to hire senior Data Platform Engineers who have some consultancy experience. Skills: S3/Redshift, Terraform, CI/CD (GitLab), Airflow, YAML
Knutsford, Cheshire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
collaborate across teams. What We're Looking For: Proven experience with BigQuery, dbt/Dataform, and Tableau. Strong SQL and modern data architecture knowledge. Familiarity with orchestration tools (Prefect, Airflow), Git, CI/CD, and Python. Excellent communication and stakeholder engagement skills. Ready to make an impact? Apply now.
Sacramento, California, United States Hybrid / WFH Options
KK Tech LLC
Strong hands-on experience with SQL, ETL processes, and data warehousing concepts. Knowledge of Snowpipe, Tasks, Streams, and Time Travel features. Experience with Python or ETL tools (e.g., Airflow, Talend, Informatica) is a plus. Fluent in Mandarin Chinese (spoken and written) and proficient in English.
Terraform (Cloud/Open Source) for cloud infrastructure automation. Proficiency with Databricks: clusters, jobs, notebooks, DBFS, and workspace management. Hands-on experience with MLOps tools (e.g., MLflow, TFX, Kubeflow, Airflow, or SageMaker Pipelines). Experience deploying and scaling Large Language Models (LLMs) in production environments. Proficiency in CI/CD tooling (GitHub Actions, Azure DevOps, GitLab CI, etc.). Scripting and …
NoSQL. Specialization in designing and implementing large-scale data pipelines, ETL processes, and distributed systems. Should be able to work independently with minimal help/guidance. Good understanding of Airflow, Data Fusion, and Dataflow. Strong background and experience in data ingestion, transformation, modeling, and performance tuning. Migration experience from Cornerstone to GCP will be an added advantage. Support …
have strong, proven experience in a contracting capacity with: Python, SQL, Azure Fabric. Highly advantageous if you have: dbt (Data Build Tool), Data Warehousing expertise, modern orchestration tools (e.g., Airflow, Dagster, Prefect), public sector experience, React. Clearance: You must hold active SC Clearance to be considered.
challenges. What you'll need: Solid hands-on experience with Python and SQL. Strong knowledge of data pipelines, cloud infrastructure (AWS & Azure), and integration tools. Familiarity with tools like Airflow and version control (Git). Confident working with structured and unstructured datasets. Comfortable contributing to both technical delivery and collaborative problem-solving. Salary and Benefits: This role is paying a …
Advanced SQL and dimensional data modelling skills (fact/dimension design, hierarchies, SCDs). Proven experience building ETL/ELT pipelines using tools such as SSIS, dbt, or Airflow. Solid understanding of database administration, tuning, and performance optimisation across MSSQL and PostgreSQL. Key Responsibilities: Design and maintain data models that meet business requirements, ensuring scalability, consistency …
warehousing and transformation. Strong SQL skills and understanding of modern data architecture principles. Hands-on experience with Tableau for enterprise-grade dashboard development. Familiarity with orchestration tools (e.g., Prefect, Airflow), data quality frameworks, and metadata tools. Proficiency in Git, CI/CD, and scripting with Python. Excellent communication skills and ability to work collaboratively across technical and business teams.
and PySpark. Exposure to generative AI platforms or a passion for building AI-powered solutions. Ability to lead client delivery in dynamic, fast-paced environments. Familiarity with tools like Airflow, Databricks, or dbt is a plus. What's on offer: Salary up to £75,000, bonus and equity options. Hybrid working: 3 days in a vibrant central London office …
architecture, build data pipelines, and integrate AI/LLM components into production systems. Role Breakdown: 50% Backend Engineering (FastAPI, Flask, Node.js, CI/CD); 30% Data Engineering (ETL, dbt, Airflow); 20% AI/LLM Integration (LangChain, RAG pipelines, orchestration). Key Responsibilities: Design and build backend services to support AI agent deployment. Develop scalable data pipelines and integration layers. Implement …