strong focus on semantic search and other advanced natural language processing (NLP) techniques. Proven experience with MLOps, data platforms (e.g., Snowflake), data pipelines (e.g., Airflow), and messaging platforms (e.g., Kafka), across multiple geographic regions. Strong background in data architecture, software architecture, and distributed systems, with experience coordinating technical efforts …
technical leadership role on projects and experience with transitioning projects into a support program. Experience with Google Cloud Platform (GCP) services, including Cloud Composer (Apache Airflow) for workflow orchestration. Strong experience in Python, with demonstrable experience in developing and maintaining data pipelines and automating data workflows. Proficiency in … e.g., Git). Strong expertise in Python, with a particular focus on libraries and tools commonly used in data engineering, such as Pandas, NumPy, and Apache Airflow. Experience with data pipelines, ELT/ETL processes, and data wrangling. Dashboard analytics experience (Power BI, Looker Studio, or Tableau). Excellent English, written and …
orchestration tools (e.g., Docker, Kubernetes). Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation). Experience with development, ETL tools, and frameworks (e.g., Apache Airflow, Talend, Fivetran). Proficiency in SQL and experience with Python, Java, or Scala. Familiarity with cloud platforms (AWS, GCP, Azure) and associated …
and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced Senior Data Engineer seeking a Lead role, or a Lead Data Engineer aiming …
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
. Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
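The ELT pattern these tools orchestrate can be sketched in plain, tool-agnostic Python: raw records are extracted, landed unchanged in a staging area, and only then transformed. All names here (fetch sources, the in-memory "warehouse") are hypothetical placeholders, not any specific product's API.

```python
# Minimal ELT sketch: extract raw records, load them untouched into
# staging, then transform inside the "warehouse" (an in-memory dict
# standing in for Snowflake/BigQuery/Athena tables).

def extract():
    """Pull raw records from a source system (hard-coded here)."""
    return [{"order_id": 1, "amount": "19.99"},
            {"order_id": 2, "amount": "5.00"}]

def load(warehouse, records):
    """Land raw records in a staging table before any transformation."""
    warehouse["staging_orders"] = records

def transform(warehouse):
    """Derive a cleaned table from staging: cast amounts to float."""
    warehouse["orders"] = [
        {**r, "amount": float(r["amount"])}
        for r in warehouse["staging_orders"]
    ]

warehouse = {}
load(warehouse, extract())
transform(warehouse)
total = round(sum(r["amount"] for r in warehouse["orders"]), 2)
print(total)  # → 24.99
```

In an orchestrator such as Airflow, each of these three functions would typically become its own task so failures can be retried per stage.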
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services, including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and …
and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data …
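As a tool-agnostic illustration of the automated anomaly detection described above, one common approach is a z-score check of today's metric against a rolling baseline. The threshold and function names below are hypothetical, not taken from any listing.

```python
# Flag a daily row count that deviates too far from its recent history.
from statistics import mean, stdev

def is_anomalous(history, current, z_threshold=3.0):
    """Return True if `current` is more than z_threshold standard
    deviations from the mean of `history` (recent daily values)."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        # Flat history: any change at all is suspicious.
        return current != mu
    return abs(current - mu) / sigma > z_threshold

history = [1000, 1020, 980, 1010, 995]  # recent daily row counts
print(is_anomalous(history, 1005))  # within normal range → False
print(is_anomalous(history, 120))   # sudden drop → True
```

A check like this would typically run as a scheduled task (e.g., an Airflow DAG) that pages an on-call engineer when it returns True.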
implement elegant solutions for them. Are a data enthusiast who wants to be surrounded by brilliant teammates and huge challenges. Bonus Points: Experience with Apache Airflow, including designing, managing, and troubleshooting DAGs and data pipelines. Experience with CI/CD pipelines and tools like Jenkins, including automating the …
with NoSQL databases (e.g., MongoDB) and relational databases (e.g., PostgreSQL, MySQL). 5+ years in Python and SQL work. Knowledge of ETL tools (e.g., Apache Airflow) and cloud platforms (e.g., AWS, Azure, GCP). Understand data modelling concepts and best practices. Experience with healthcare data standards (e.g., HL7 …
and flexible systems. Influence opinion and decision-making across AI and ML. Skills: Python; SQL/Pandas/Snowflake/Elasticsearch; Docker/Kubernetes; Airflow/Spark; familiarity with GenAI models/libraries. Requirements: 6+ years of relevant software engineering experience post-graduation; a degree (ideally a Master’s …
and client-facing solutioning. Deep understanding of data modeling, ETL/ELT, performance tuning, and security within Snowflake. Familiarity with tools like dbt, Airflow, Python, SQL, and cloud platforms (AWS/Azure/GCP). Excellent communication and interpersonal skills; ability to interface effectively with both technical and …
SQL and experienced with various database technologies. Knowledge of Python or Java, with the ability to leverage either in building scalable solutions. Experience with Airflow and other big data technologies is useful. Familiarity with DevOps practices and tools, including CI/CD pipelines. Previous experience with reporting tools is helpful …
multi-step pipeline) About you: Strong Python. Experience building complex data transformation pipelines. Experience with Databricks at scale, preferably with Iceberg. Experience with Airflow or Dagster. Experience with AWS and open source technologies on top of dataweave. Desirable: medical data exposure is useful, but video/image data, not …
in modern data platforms, tools, and technologies, such as: - Advanced data modelling, operational and analytical - Python, SQL - Databricks, Spark - Orchestration frameworks such as Dataform, Airflow, and GCP Workflows - Modern architecture and cloud platforms (GCP, AWS, Azure) - DevOps practices - Data warehouse and data lake design and implementation - Familiarity with containerization …
Pandas, Matplotlib). Comfortable deploying applications in cloud ecosystems, particularly AWS, and using infrastructure automation tools. Experienced with building data workflows and managing databases; Airflow and SQL are especially valued. A pragmatic engineer who follows clean development practices: version control, testing, packaging, etc. Analytical and curious, with a background …