production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills: Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge …
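For the asynchronous programming skills the listing above calls out, a minimal asyncio sketch (illustrative only; the function and field names are hypothetical, not from the posting):

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    # Stand-in for an I/O-bound call such as a database query or HTTP request.
    await asyncio.sleep(0.1)
    return {"id": record_id, "status": "ok"}

async def main() -> None:
    # Run the I/O-bound calls concurrently instead of one after another.
    results = await asyncio.gather(*(fetch_record(i) for i in range(5)))
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```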
robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes …
at least one cloud data platform (e.g. AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow. Good knowledge of stream and batch processing solutions like Apache Flink, Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us Work on …
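As a flavour of the MLflow experiment tracking this listing describes, a minimal sketch using scikit-learn and a local tracking store (the synthetic data and parameters are hypothetical stand-ins, not the employer's setup):

```python
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for sensor data; a real pipeline would load curated features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 5}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_params(params)                 # record hyperparameters for the run
    mlflow.log_metric("accuracy", accuracy)   # record the evaluation metric
    mlflow.sklearn.log_model(model, "model")  # version the fitted model artifact
```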
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus. …
of the programming languages like Python, C++, Java ● Experience with version control and code review tools such as Git ● Knowledge of the latest data pipeline orchestration tools such as Airflow ● Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation). ● Familiarity with data quality, data governance, and observability tools (e.g., Great …
in C# and Python with additional skills in Java, JavaScript/TypeScript, Angular, very strong SQL, Windows Server, UNIX, and .NET. Strong research skills. Strong experience with Terraform, AWS, Airflow, Docker, GitHub/GitHub Actions, Jenkins/TeamCity • Strong AWS-specific skills for Athena, Lambda, ECS, ECR, S3 and IAM • Strong knowledge of industry best practices in development and …
move us towards our vision of scaling up through product-led growth. This role will be focused on our backend system (Symfony, PHP) and our data products (BigQuery, DBT, Airflow), but there will be opportunities to work across the platform, including agentic AI (Python, LangChain), frontend (React, TypeScript), the APIs (GraphQL, REST), our integration tool of choice (Tray.ai) and …
and Responsibilities While in this position your duties may include but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the …
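To illustrate the kind of Airflow pipeline this listing describes, a minimal sketch using the TaskFlow API (assumes Airflow 2.4+; the DAG name and task bodies are hypothetical placeholders):

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder for pulling rows from an API, database, or flat file.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Placeholder for cleaning and enriching the extracted records.
        return [{**row, "value": row["value"] * 2} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder for writing the transformed rows to a warehouse table.
        print(f"Loaded {len(rows)} rows")

    # TaskFlow wiring: extract -> transform -> load.
    load(transform(extract()))

example_etl()
```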
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita
Knowledge of Azure Storage, Medallion Architecture, and data formats such as JSON, CSV, and Parquet. Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. Exposure to Apache Airflow and DBT is a bonus. Familiarity with agile principles and practices. Experience with Azure DevOps pipelines. The "Nice to Haves" Certification in Azure or related technologies. Experience …
or may consider candidates with exposure to credit, rates, equities, options, etc. Experience with market data providers (Bloomberg, Refinitiv, etc.) would be useful. Any familiarity with tools such as Airflow, Prefect, or other orchestration frameworks would be advantageous. Experience building internal tools or dashboards using Dash, Streamlit, or similar web-based data analytics platforms would be nice to have …
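For the dashboarding side of that listing, a minimal Streamlit sketch (run with `streamlit run app.py`; the price series is synthetic, standing in for a feed such as Bloomberg or Refinitiv):

```python
import numpy as np
import pandas as pd
import streamlit as st

st.title("Illustrative market data dashboard")

# Synthetic random-walk price series standing in for a real market data feed.
dates = pd.date_range("2024-01-01", periods=100, freq="D")
prices = pd.Series(100 + np.random.default_rng(0).standard_normal(100).cumsum(), index=dates)

# Interactive control: the chart recomputes whenever the window changes.
window = st.slider("Moving-average window (days)", 5, 50, 20)
st.line_chart(pd.DataFrame({"price": prices, "moving_avg": prices.rolling(window).mean()}))
```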
the chance to work on cutting-edge solutions that make a real impact. Key Responsibilities Data Engineering: Design and implement data pipelines, lakes, and warehouses using tools like Spark, Airflow, or dbt. API & Microservices Development: Build secure, efficient APIs and microservices for data integration. Full Stack Development: Deliver responsive, high-performance web applications using React (essential), plus Angular or …
familiarity with automated testing or CI/CD pipelines Bachelor's degree in Computer Science, Software Engineering, or a related field Bonus Skills Knowledge of data orchestration tools (e.g. Airflow) Background in Java or distributed data systems Prior exposure to large-scale infrastructure in data-heavy environments (e.g. trading, analytics, or research) Master's degree in a technical discipline …
Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery …
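As a sketch of the Kafka-style streaming work mentioned above, a minimal Python consumer using the confluent-kafka client (the broker address, group id, and topic name are hypothetical):

```python
from confluent_kafka import Consumer

# Hypothetical broker/topic configuration; real deployments load this from config.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "example-group",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # block up to 1s waiting for a record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: "
              f"{msg.value().decode('utf-8')}")
finally:
    consumer.close()  # commit final offsets and leave the group cleanly
```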
following engineering disciplines : Cloud Engineering Data Engineering (not building pipelines but designing and building the framework) DevOps MLOps/LLMOps Often work with the following technologies : Azure, AWS, GCP Airflow, dbt, Databricks, Snowflake, etc. GitHub, Azure DevOps and related developer tooling and CI/CD platforms, Terraform or other Infra-as-Code MLflow, AzureML or similar for MLOps; LangSmith More ❯
data modeling (star schema, snowflake schema). Version Control Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus) Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services ( AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage ). Exposure to workflow orchestration … tools ( ApacheAirflow, Prefect, or Dagster ). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. More ❯