Data Engineering Manager, Amazon Music Technology We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music's business decisions and …
Join us on our mission to make a better world of work. Culture Amp is the world's leading employee experience platform, revolutionizing how 25 million employees across more than 6,500 companies create a better world of work. Culture …
Walsall, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Adecco
Senior Data Engineer Hybrid/remote - North-West based £65,000-£80,000 + Bonus + Benefits Are you a data enthusiast eager to work on innovative solutions that impact millions? We're looking for an experienced Senior Data Engineer to …
utilizing the Django web framework for the backends and React for developing the client facing portion of the application Create, extract, transform, and load (ETL) pipelines using Hadoop and Apache Airflow for various production big data sources to fulfill intelligence data availability requirements Automate retrieval of data from various sources via API and direct database queries for intelligence … iterations Support capabilities briefings for military personnel Required Qualifications: Bachelor's degree in related field preferred Active TS/SCI Required Preferred Qualifications: Windows 7/10, MS Project Apache Airflow Python, Java, JavaScript, React, Flask, HTML, CSS, SQL, R, Docker, Kubernetes, HDFS, Postgres, Linux AutoCAD JIRA, Gitlab, Confluence Also looking for a Senior Developer at a higher …
utilizing the Django web framework for the backends and React for developing the client facing portion of the application Create, extract, transform, and load (ETL) pipelines using Hadoop and Apache Airflow for various production big data sources to fulfill intelligence data availability requirements Automate retrieval of data from various sources via API and direct database queries for intelligence … for military personnel Required Qualifications: Active TS/SCI Required 7-10 years' experience Preferred Qualifications: Bachelor's degree in related field preferred Windows 7/10, MS Project Apache Airflow Python, Java, JavaScript, React, Flask, HTML, CSS, SQL, R, Docker, Kubernetes, HDFS, Postgres, Linux AutoCAD JIRA, Gitlab, Confluence About Us: IntelliBridge delivers IT strategy, cloud, cybersecurity, application …
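For illustration, a minimal sketch of the kind of Airflow ETL DAG these roles describe (Airflow 2.4+ style; the DAG id, task names, sample records, and the "warehouse" load step are hypothetical placeholders, not details from the posting):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Pull raw records from an upstream source (API call, HDFS export, etc.)
        return [{"id": 1, "value": 42}, {"id": 2, "value": 7}]


    def transform(ti, **context):
        # Read the extract output from XCom and normalise it
        rows = ti.xcom_pull(task_ids="extract")
        return [{**row, "value": row["value"] * 2} for row in rows]


    def load(ti, **context):
        # In a real pipeline this would write to HDFS, a warehouse, or an API
        rows = ti.xcom_pull(task_ids="transform")
        print(f"loading {len(rows)} rows")


    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

The dependency chain at the end is what gives the scheduler the extract-transform-load ordering; each task's return value is passed downstream via XCom rather than shared state.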
technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge on key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/MQ/Event Hubs, Kafka … Dataflow/Airflow/ADF etc. Excellent consulting experience and ability to design and build solutions, actively contribute to RfP response. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients Excellent ETL skills, Data Modeling Skills Excellent communication skills Ability to define the … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers] GCP, AWS, Azure, Big data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub Sub/Kinesis/MQ/Event Hubs, Kafka Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based …
desirable to have hands-on experience with these. Substantial experience using tools for statistical modelling of large data sets Some familiarity with data workflow management tools such as Airflow as well as big data technologies such as Apache Spark or other caching and analytics technologies Expertise in model training, statistics, model evaluation, deployment and optimisation, including RAG …
Reston, Virginia, United States Hybrid / WFH Options
CGI
governance. Skilled in leveraging S3, Redshift, AWS Glue, EMR, Azure Data Lake, and Power BI to deliver secure, high-performance solutions and self-service BI ecosystems. Skilled in leveraging Apache Airflow, Apache Flink and other data tools Experienced in distributed data compute architecture using Apache Spark and PySpark. Education: Bachelor's degree in computer science, Information …
with MLOps practices and model deployment pipelines Proficient in cloud AI services (AWS SageMaker/Bedrock) Deep understanding of distributed systems and microservices architecture Expert in data pipeline platforms (Apache Kafka, Airflow, Spark) Proficient in both SQL (PostgreSQL, MySQL) and NoSQL (Elasticsearch, MongoDB) databases Strong containerization and orchestration skills (Docker, Kubernetes) Experience with infrastructure as code (Terraform, CloudFormation …)
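As a rough illustration of the "data pipeline platforms" requirement above, a minimal kafka-python producer/consumer sketch (the broker address, topic name, and payload are assumptions made for the example, not details from the posting):

    import json

    from kafka import KafkaConsumer, KafkaProducer

    # Publish a JSON event to a topic (broker and topic names are placeholders)
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("events", {"user_id": 123, "action": "signup"})
    producer.flush()

    # Consume the same topic from the beginning and print each event
    consumer = KafkaConsumer(
        "events",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    for message in consumer:
        print(message.value)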
PyTorch, TensorFlow, Hugging Face) Proven MLOps, big data, and backend/API development experience Deep understanding of NLP and LLMs Proficient with cloud platforms (AWS/GCP/Azure), Airflow, DBT, Docker/Kubernetes Strong collaboration, problem-solving, and coding best practices Nice to have: LLM fine-tuning, streaming data, big data warehousing, open-source contributions.
developing with Python and related ML libraries Functional programming experience a plus Willing to learn and develop in new technologies as required Experience with MongoDB Experience with MLflow or Airflow a plus Other tools: Maven, Git, Linux. Location: Customer Site, Telework. Telework: 75% (in office at least one day a week)
Science, or a related technical field required • 4+ years in data engineering, preferably within secure or classified environments • Strong proficiency in Python, Spark, SQL, and orchestration tools such as Airflow • Hands-on experience with classified data management, secure networking, and infrastructure performance tuning Preferred Experience • Familiarity with secure cloud environments (e.g., AWS GovCloud, C2S) • Strong troubleshooting and optimization skills
of machine learning, statistics, and data modeling Expert in Python (pandas, scikit-learn, etc.) and SQL Experience with cloud platforms (Azure, AWS, or GCP) Familiarity with tools like MLflow, Airflow, Power BI Strong stakeholder management and communication skills Fluent in Dutch (mandatory)
platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Solid understanding of ML lifecycle: data handling, model development, deployment, and monitoring. Familiarity with MLOps tools such as MLflow, Airflow, DVC, or similar. Experience with version control (Git), CI/CD pipelines, and software engineering best practices. Fluent in English; knowledge of French or Dutch is a plus. Desirable …
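To ground the MLOps tooling named in these listings, a minimal MLflow tracking sketch (the experiment name, hyperparameter, and toy dataset are illustrative assumptions only):

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error

    # Toy data and model purely for illustration
    X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)
    model = RandomForestRegressor(n_estimators=100, random_state=0)

    mlflow.set_experiment("demo-experiment")
    with mlflow.start_run():
        model.fit(X, y)
        mse = mean_squared_error(y, model.predict(X))
        # Log the hyperparameter, metric, and serialized model to the tracking server
        mlflow.log_param("n_estimators", 100)
        mlflow.log_metric("train_mse", mse)
        mlflow.sklearn.log_model(model, "model")

Each run recorded this way is browsable in the MLflow UI, which is the usual bridge between experimentation and the deployment/monitoring stages these roles mention.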
Engineering etc Software Development experience in Python or Scala An understanding of Big Data technologies such as Spark, messaging services like Kafka or RabbitMQ, and workflow management tools like Airflow SQL & NoSQL expertise, ideally including Postgres, Redis, MongoDB etc Experience with AWS, and with tools like Docker & Kubernetes. As well as this, you will be someone willing to take …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
as Azure Data Factory (ADF) and Python. Ensuring the quality of raw datasets to empower Data Analysts in creating robust data models. Deploying and managing data tools on Kubernetes (Airflow, Superset, RStudio Connect). Supporting Data Analytics through the management of DBT, DevOps, and deployment rules. You will have the opportunity to work end-to-end, making meaningful contributions …
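By way of illustration for the "managing data tools on Kubernetes" duty, a short sketch using the official kubernetes Python client to check data-tool deployments; the namespace name and the idea of a dedicated "data-tools" namespace are assumptions, not details from the posting:

    from kubernetes import client, config

    # Load credentials from the local kubeconfig (in-cluster config is also possible)
    config.load_kube_config()
    apps = client.AppsV1Api()

    # List deployments in a hypothetical "data-tools" namespace (Airflow, Superset, ...)
    for deployment in apps.list_namespaced_deployment(namespace="data-tools").items:
        ready = deployment.status.ready_replicas or 0
        wanted = deployment.spec.replicas
        print(f"{deployment.metadata.name}: {ready}/{wanted} replicas ready")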
security standards. Required Experience: Active Secret clearance. 3-7 years in data engineering, preferably within secure, high-side environments. Proficiency in Python, Spark, SQL, and data orchestration tools (e.g., Airflow). Experience with classified data management, secure networking, and infrastructure optimization. Preferred Experience: Familiarity with IC standards (UDS, IC ITE) and secure cloud environments (AWS GovCloud, C2S). Strong …
NoSQL (e.g., MongoDB and Firestore). SQL querying (e.g., BigQuery, Snowflake), including the ability to work with complex data structures and very large data volumes. Orchestration services (e.g., Airflow, Luigi, Cloud Composer). Proactive, independent, responsible and attentive to detail. Eager and able to learn, analyse, resolve problems, and improve the standard of BVGroup data infrastructure. Degree in …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
and familiarity with Git Strong communicator, eager to learn, and naturally curious Comfortable working across multiple business areas with varied responsibilities Nice-to-Haves Exposure to tools like Prefect, Airflow, or Dagster Familiarity with Azure SQL, Snowflake, or dbt Tech Stack/Tools Python SQL (on-prem + Azure SQL Data Warehouse) Git Benefits £35,000-£40,000 starting …
from you. Key Responsibilities: - Design and build high-scale systems and services to support data infrastructure and production systems. - Develop and maintain data processing pipelines using technologies such as Airflow, PySpark and Databricks. - Implement dockerized high-performance microservices and manage their deployment. - Monitor and debug backend systems and data pipelines to identify and resolve bottlenecks and failures. - Work collaboratively …
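A minimal PySpark sketch of the kind of batch pipeline step this listing describes (the input/output paths, event type, and column names are placeholders chosen for the example):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

    # Read raw events, aggregate spend per user per day, and write the result back out
    events = spark.read.parquet("s3://example-bucket/raw/events/")
    daily_totals = (
        events
        .filter(F.col("event_type") == "purchase")
        .groupBy("user_id", "event_date")
        .agg(F.sum("amount").alias("total_amount"))
    )
    daily_totals.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_totals/")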
with remote stakeholders Familiarity with AI development tools such as Cursor, GitHub Copilot, or Claude. BS degree in Computer Science or related engineering field Nice to Have Experience with Airflow, Celery, AWS and/or Azure, Postgres Experience with API platform development Experience with Go Ready to be part of AI transformation at Abnormal AI? Apply Now! Once you …
adaptable to fast-paced startup environment, comfortable with ambiguity and evolving responsibilities Work Authorization: Must be eligible to work in US or UK Preferred Experience: Data orchestration tools (e.g., Airflow, Prefect); experience deploying, monitoring, and maintaining ML models in production environments (MLOps); familiarity with big data technologies (e.g., Spark, Hadoop); background in time-series analysis and forecasting; experience with data …