Poole, Dorset, United Kingdom Hybrid / WFH Options
Aspire Personnel Ltd
needs into technical requirements. Possess outstanding communication and interpersonal skills, facilitating clear and effective collaboration within and outside the team. Desirables: Familiarity with the Apache Airflow platform. Basic knowledge of BI tools such as Power BI to support data visualization and insights. Experience with version control using Git …
Greater London, England, United Kingdom Hybrid / WFH Options
Datatech Analytics
tuning and optimisation Solid understanding of data warehousing principles, data modelling practice, Excellent knowledge of creation and maintenance of data pipelines - ETL Tools (e.g. Apache Airflow) and Streaming processing tools (e.g. Kinesis) Strong problem-solving and analytical skills, with the ability to troubleshoot and resolve complex data-related …
Middlesex, Greater London, United Kingdom Hybrid / WFH Options
Datatech
tuning and optimisation
· Solid understanding of data warehousing principles, data modelling practice
· Excellent knowledge of creation and maintenance of data pipelines - ETL Tools (e.g. Apache Airflow) and Streaming processing tools (e.g. Kinesis)
· Strong problem-solving and analytical skills, with the ability to troubleshoot and resolve complex data-related …
and ML scientists to plan the architecture for end-to-end machine learning workflows. Implement scalable training and deployment pipelines using tools such as Apache Airflow and Kubernetes. Perform comprehensive testing to ensure reliability and accuracy of deployed models. Develop instrumentation and automated alerts to manage system health …
Requirements: Significant experience designing and scaling cloud-based data lake architectures (ideally Delta Lake on Databricks or similar) Deep expertise in workflow orchestration using Airflow, with production-grade DAGs and solid dependency management, knowledge of dbt is preferable Strong background in building reliable, scalable batch and streaming pipelines using …
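Several of these listings ask for orchestration experience with "production-grade DAGs and solid dependency management". The core idea — tasks ordered by explicit upstream/downstream edges — can be sketched with Python's standard library alone (a conceptual sketch using `graphlib`, not Airflow's own API; the task names are illustrative):

```python
from graphlib import TopologicalSorter

# Illustrative ETL task graph: each task maps to the set of tasks it
# depends on, mirroring how an Airflow DAG wires upstream >> downstream.
tasks = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

def run_order(graph):
    """Return one valid execution order that respects every dependency."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(tasks)
# Both extracts come before the join; the load is always last.
assert order.index("transform_join") > order.index("extract_orders")
assert order[-1] == "load_warehouse"
```

A scheduler like Airflow layers retries, backfills, and per-task state on top of exactly this ordering; `graphlib` also raises `CycleError` on a cyclic graph, the same validation Airflow performs when parsing a DAG.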
Experience with cloud platforms (AWS, Azure or GCP) and infrastructure-as-code tools Proficiency in spatial databases (e.g., PostGIS) and data pipeline tools (e.g., Airflow) Familiarity with geospatial APIs, formats (GeoJSON, WKT) and data services Security Clearance: Candidates must be eligible to pass Baseline Personnel Security Standard (BPSS) checks. …
Data Engineers! What skills you need...... Well versed working with Python Deep understanding of SQL Experienced in Spark and/or PySpark. AWS Glue & Airflow experience is ideal for this position. Building, developing and maintaining robust data pipelines Good understanding of working with APIs, databases, etc. Deep cloud knowledge …
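The "building, developing and maintaining robust data pipelines" requirement above usually reduces to cast-validate-aggregate steps over raw records. A minimal stand-alone sketch in plain Python (the schema and validation rules are invented for illustration; a Glue or PySpark job would express the same filter/cast pattern with DataFrame operations):

```python
# Hypothetical raw records, as an API extract might return them.
raw = [
    {"id": 1, "amount": "19.99", "region": "UK"},
    {"id": 2, "amount": None,    "region": "UK"},   # invalid: no amount
    {"id": 3, "amount": "5.00",  "region": "US"},
]

def transform(records):
    """Cast amounts to float and drop rows that fail validation —
    the same cleanse step a PySpark job runs before loading."""
    clean = []
    for row in records:
        if row["amount"] is None:
            continue  # in production this row would go to a dead-letter sink
        clean.append({**row, "amount": float(row["amount"])})
    return clean

rows = transform(raw)
total = sum(row["amount"] for row in rows)   # simple aggregate for loading
```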
Python, dbt, and data modelling, as well as experience or a strong interest in blockchain technology. Twinstake utilises a modern data stack including Airbyte, Airflow, Snowflake, dbt and Quicksight. What you will contribute: Data Modelling : Building scalable data models to transform complex datasets into actionable insights, using advanced SQL …
management processes, and data profiling. Experience in developing APIs and working with WebSockets. Knowledge of React, Django, FastAPI, or equivalent technologies. Previous experience with Airflow, Linux shell commands, and setting up websites on IIS. What we would like from you: Bachelor's or Master's degree in a relevant field. …
Menlo Park, California, United States Hybrid / WFH Options
GRAIL Inc
Demonstrated positive collaboration skills with key stakeholders and system users to deliver robust systems and compelling results Experience with AWS (Redshift, S3, Glue, Managed Airflow) Snowflake, Tableau Experience with SaaS application data sets (Netsuite, Salesforce, Workday, Coupa) Life sciences/BioTech sector experience in a regulated environment The expected …
Washington, Washington DC, United States Hybrid / WFH Options
Gridiron IT Solutions
Web Managed Services (AWS). Working knowledge with software platforms and services, such as, Docker, Kubernetes, JMS/SQS, SNS, Kafka, AWS Lambda, NiFi, Airflow, or similar. Proficient experience utilizing JavaScript, Elasticsearch, JSON, SQL, XML. Working knowledge with datastores MongoDB/DynamoDB, PostgreSQL, S3, Redshift, JDBC/ODBC, and …
operational businesses Experience using LLMs or AI tools to structure and extract meaning from unstructured data Experience automating workflows and deploying model pipelines (e.g. Airflow, dbt, MLFlow, or similar) Exposure to business planning, pricing, or commercial decision-making Familiarity with geospatial data Experience in fast-scaling startups or operational …
Technical excellence across the Data Engineering ecosystem: We primarily work in Python, Go and SQL. Our code (tracked via GitHub) deploys to GCP (BigQuery, Airflow/Composer, GKE, Cloud Run), dbt Cloud and Azure via Terraform. We recognise this is a broad list; if you're not deeply familiar …
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
Semantic Kernel, and tools such as MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency with pipeline orchestration tools, such as Airflow, Kubeflow, and Argo. Outstanding communication skills, combining subject matter expertise with a flair for statistics. A results-driven attitude, a passion for innovation, and …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Oliver James Associates Ltd
sync/async methods. Expertise in databases, database architecture, SQL, and stored procedures. Strong background in data integration technologies such as Azure Data Factory, Apache Airflow, and Databricks AutoLoader. Proficiency in building CI/CD pipelines and implementing infrastructure as code. Experience with event-driven data flows and … message aggregation technologies (Event Hub, Apache Kafka). A solid understanding of current and emerging technologies and how they can deliver business value. Desirable experience with data matching processes. Strong ability to translate business strategies into technical solutions and business requirements into technical designs. If you're passionate about …
or the travel industry. Conducted and analysed large-scale A/B experiments Experience mentoring team members Experience with workflow orchestration technologies such as Airflow, Dagster or Prefect Experience with technologies such as: Google Cloud Platform, particularly Vertex AI Docker and Kubernetes Perks of joining us: Company pension contributions …
london, south east england, united kingdom Hybrid / WFH Options
Kantar Media
Tomcat, UNIX tools, Bash/sh, SQL, Python, Hive, Hadoop/HDFS, and Spark. Work within a modern cloud DevOps environment using Azure, Git, Airflow, Kubernetes, Helm, and Terraform. Demonstrate solid knowledge of computer hardware and network technologies. Experienced in writing and running SQL and Bash scripts to automate …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
Pandas, numpy, scipy, scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, ElasticSearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly … GitHub Understanding of how to build and run containerized applications (Docker, Helm) Familiarity with, or a working understanding of big data search tools (Airflow, PySpark, Trino, OpenSearch, Elastic, etc.) Desired Skills (Optional) Docker Jenkins Hadoop/Spark Kibana Kafka NiFi ElasticSearch About The DarkStar Group Our Company The …
GitHub integration and automation). Experience with scripting languages such as Python or R. Working knowledge of message queuing and stream processing. Experience with Apache Spark or similar technologies. Experience with Agile and Scrum methodologies. Familiarity with dbt and Airflow is an advantage. Experience working in a start …
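"Message queuing and stream processing", listed above, boil down to a producer publishing events that a decoupled consumer drains. A minimal standard-library sketch (conceptual only; a broker such as Kafka or SQS replaces the in-process queue with a durable, networked one, and the event shape here is invented):

```python
import queue
import threading

events = queue.Queue()          # stands in for a broker topic/queue
totals = {"processed": 0}

def consumer():
    """Drain messages until the None sentinel signals end of stream."""
    while True:
        message = events.get()
        if message is None:
            break
        totals["processed"] += 1   # real consumers would transform/load here
        events.task_done()

worker = threading.Thread(target=consumer)
worker.start()
for event_id in range(5):        # producer side: publish five events
    events.put({"event_id": event_id})
events.put(None)                 # sentinel: no more messages
worker.join()
```

The sentinel shutdown and `task_done()` bookkeeping mirror, in miniature, the offset-commit loop a streaming consumer runs against a real broker.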