trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Tableau, DBT, Talend, Collibra, Kafka/Confluent, Astronomer/Airflow, and Kubernetes. This forms part of a longer-term strategic direction to implement Data Mesh, and with it establish shared platforms that enable a connected collection of … and driving a culture of iterative improvement. Modern data stack - hands-on deployment and governance of enterprise technologies at scale (e.g. Snowflake, Tableau, DBT, Fivetran, Airflow, AWS, GitHub, Terraform, etc.) for self-service workloads. Thought leadership and influencing - deep interest in the data platforms landscape to build well-articulated proposals that are supported by strong …
and meeting deadlines. Proficiency in SQL (BigQuery), Python, Git/GitHub, and preferably Looker (Tableau or Power BI are acceptable as well). Above-average knowledge of DBT, Docker, GCP, and Airflow. Experience in the cryptocurrency industry, fintech sector, or platform-type businesses is preferred but not required. Personal Attributes: Analytical mindset with a passion for data-driven decision-making. Strong … ambitious with a results-oriented attitude and a continuous-improvement mindset. Technologies you will work with: Python; SQL (BigQuery); GCP; EPPO for experimentation; DBT, Docker, Cloud Run/Kubernetes, and Airflow for data orchestration and data pipelines; Looker for data visualization; Git and GitHub for code collaboration. Ability to leverage AI tools such as Cursor and LLMs in the day-to…
data accessibility and quality. Key Skills & Experience Required – About You: Essential: Strong proficiency in Python, SQL, and Jinja. Experience with cloud platforms (preferably AWS). Familiarity with orchestration tools (ideally Airflow). Experience building CI/CD workflows (ideally using GitHub Actions). Up-to-date knowledge of modern data engineering practices. Proven ability to design and develop scalable data pipelines. Strong …
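As an illustration of the orchestration work described above, here is a minimal sketch of an Airflow DAG wiring an extract step to a load step. This is not taken from the listing; the DAG id, schedule, and task logic are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Hypothetical extract step: pull data from a source system.
    print("pulling source data")

def load():
    # Hypothetical load step: write the result to the warehouse.
    print("writing to the warehouse")

with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```

In a CI/CD setup like the one mentioned, a workflow would typically lint and test DAG files before deploying them to the Airflow environment.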
analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong …
West London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
more. Technical Requirements: Advanced proficiency in Python and modern software engineering practices. Experience architecting solutions using major cloud platforms (Azure, AWS, GCP). Familiarity with technologies such as Databricks, Airflow, dbt, Snowflake, GitHub CI/CD, and infrastructure-as-code. Strong background across at least several of the following areas: Cloud Engineering, Data Platform Architecture, DevOps, MLOps/LLMOps. …
degree in Computer Science, Engineering, or equivalent practical experience. Desirable: Experience with LLMs or transformer-based architectures (e.g., OpenAI, Mistral, or custom-trained models). Familiarity with tools such as Airflow, Spark, or Dask for large-scale data processing. Awareness of AI ethics, data privacy, and legal considerations in high-stakes environments. People Source Consulting Ltd is acting as an …
skills and grow their careers. Deep experience with the full software development lifecycle (SDLC), including design, coding, review, source control, testing, deployment, and operations. Nice to Have: Experience with Apache Kafka and streaming frameworks like Flink. Familiarity with observability principles such as logging, monitoring, and tracing. Experience with web scraping and information extraction technologies. …
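To make the Kafka experience above concrete, here is a minimal consumer sketch using the confluent-kafka Python client. The broker address, group id, and topic are hypothetical placeholders, not details from the listing.

```python
from confluent_kafka import Consumer

# Hypothetical connection settings; replace with your cluster details.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "example-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["example-topic"])

try:
    while True:
        msg = consumer.poll(1.0)  # block up to one second for a message
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        print(f"received: {msg.value().decode('utf-8')}")
finally:
    consumer.close()
```

A streaming framework such as Flink would sit on top of topics like this when per-event processing with state or windowing is needed.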
Hackajob, Welcome to The Jungle) and ATS platforms (Screenloop experience a plus). Solid understanding of tech stacks including Python, React.js, AWS/Azure, and data tools like dbt, Airflow, Snowflake. Ability to conduct structured interviews and technical assessments. Familiarity with software development practices, agile methodologies, DevOps culture, and AI/ML concepts. Exceptional communication and stakeholder management skills …
drug discovery, combining quantum-inspired physics with generative models. Real-World Impact: Every feature shipped helps scientists prioritize molecules and design better candidates, faster. Modern Stack & Challenges: Python, FastAPI, Airflow, Snowflake, Kubernetes, ML workflows, scientific infra, data engineering at scale. High Ownership, High Impact: Engineers contribute to architecture, tooling, and scientific decision-making. Interdisciplinary Team: Collaborate with chemists, physicists …
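As a flavour of what FastAPI work in a stack like this can look like, here is a minimal service sketch. The endpoint, model fields, and scoring logic are hypothetical, not from the listing.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Molecule(BaseModel):
    smiles: str  # SMILES string identifying the molecule

@app.post("/score")  # hypothetical endpoint
def score(molecule: Molecule) -> dict:
    # Placeholder: a real service would call a trained model here.
    return {"smiles": molecule.smiles, "score": 0.5}
```

Run locally with `uvicorn main:app --reload`, assuming the file is saved as main.py.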
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
experience in data engineering roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable use and understanding of effective AI tooling in …
stack: Python and associated ML/DS libraries (Scikit-learn, NumPy, LightGBM, Pandas, LangChain/LangGraph, TensorFlow, etc.); PySpark; AWS cloud infrastructure: EMR, ECS, Athena, etc.; MLOps: Terraform, Docker, Airflow, MLflow. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work-from-abroad policy, 2-for-1 share purchase plans, an EV scheme to further reduce …
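To illustrate the MLOps side of a stack like this, here is a minimal MLflow experiment-tracking sketch. The experiment name, parameter, and metric are hypothetical placeholders.

```python
import mlflow

# Hypothetical experiment name; MLflow creates it if it does not exist.
mlflow.set_experiment("example-experiment")

with mlflow.start_run():
    mlflow.log_param("learning_rate", 0.01)    # record a training setting
    mlflow.log_metric("validation_auc", 0.87)  # record a result
```

Runs logged this way can then be compared in the MLflow UI when choosing which model to promote.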
skills, with the ability to work cross-functionally in an Agile environment. Exposure to data product management principles (SLAs, contracts, ownership models). Familiarity with orchestration tools and observability platforms (Airflow, dbt, Monte Carlo, etc.). Exposure to real-time/streaming pipelines. Understanding of information security best practices. Familiarity with BI tools (QuickSight, Power BI, Tableau, Looker, etc.). Interest or …
Our technology stack: Python and associated ML/DS libraries (Scikit-learn, NumPy, LightGBM, Pandas, TensorFlow, etc.); PySpark; AWS cloud infrastructure: EMR, ECS, S3, Athena, etc.; MLOps: Terraform, Docker, Airflow, MLflow, Jenkins. On-call statement: Please be aware that our Machine Learning Engineers are required to be a part of the technology on-call rota. More details on how …
have advanced SQL skills; experience with dbt and/or Looker is strongly preferred. You will be proficient with modern data platforms (e.g., Snowflake, dbt, AWS, GCP, Looker, Tableau, Airflow). You will have experience with version control tools such as Git. You will have exposure to Python for data or analytics engineering tasks (preferred). You will demonstrate excellent problem…
Data Engineering Manager, Amazon Music Technology. We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music's business decisions and …
technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have: Hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka … Dataflow/Airflow/ADF etc. Excellent consulting experience and the ability to design and build solutions and actively contribute to RfP responses. Ability to be a SPOC for all technical discussions across industry groups. Excellent design experience, with entrepreneurship skills to own and lead solutions for clients. Excellent ETL and data modeling skills. Excellent communication skills. Ability to define the … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: Designing Databricks-based …
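To ground the Big Data processing requirement above, here is a minimal PySpark sketch of a batch aggregation. The input path and column names are hypothetical placeholders; an equivalent job could be expressed in Beam, Dataflow, or ADF as the listing suggests.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-batch-job").getOrCreate()

# Hypothetical input: one row per event, including a country column.
events = spark.read.parquet("s3://example-bucket/events/")

# Count events per country and keep the ten largest groups.
top_countries = (
    events.groupBy("country")
    .agg(F.count("*").alias("event_count"))
    .orderBy(F.desc("event_count"))
    .limit(10)
)
top_countries.show()
```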