production issues. Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes … Experience with cloud platforms like AWS, GCP, or Azure. DevOps Tools: Familiarity with containerization (Docker) and infrastructure automation tools like Terraform or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing … large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and scheduling jobs in distributed systems. Stream Processing: Experience with Apache Flink or other stream processing frameworks is a plus. Desired Skills. Asynchronous Programming: Familiarity with asynchronous programming tools like Celery or asyncio. Frontend Knowledge
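To ground the workflow-orchestration requirement that recurs across these adverts, here is a minimal sketch of an Apache Airflow DAG with two dependent Python tasks. It assumes Airflow 2.4 or later (for the `schedule` argument); the DAG id, schedule, and task bodies are hypothetical, not taken from any employer's stack.

```python
# Minimal sketch, assuming Airflow 2.4+; DAG id, schedule, and task logic are
# hypothetical and purely illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would query an API or database.
    return [{"order_id": 1, "amount": 42.0}]


def load(ti):
    # Pull the upstream task's return value from XCom and "load" it.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loaded {len(rows)} rows")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```

Keeping each step as a small, idempotent task is what makes retries and backfills manageable in schedulers like Airflow or Temporal.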
Liverpool, North West England, United Kingdom Hybrid/Remote Options
Intuita
Azure Storage, Medallion Architecture, and working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure or related technologies. • Experience
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
DCS Recruitment
modern data tools such as Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (e.g., Kafka, Spark Streaming, Flink) is an advantage. Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform. What you get in return: Up to £60,000 per annum + benefits; hybrid working (3 days in office); the opportunity to lead
Sheffield, South Yorkshire, England, United Kingdom Hybrid/Remote Options
Vivedia Ltd
Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration, translating complex technical ideas into
/frameworks (pandas, numpy, sklearn, TensorFlow, PyTorch). Strong understanding and experience in implementing end-to-end ML pipelines (data, training, validation, serving). Experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks). Experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes. Solid coding practices (Git, automated
patterns and core programming concepts. Experience with the core Python data stack (Pandas, NumPy, Scikit-learn, etc.) developed in a commercial setting, an appreciation of pipeline orchestration frameworks (e.g., Airflow, Kubeflow Pipelines), applied knowledge of statistical modelling, and/or experience in implementing and supporting ML systems. Demonstrable understanding of key concepts including Python testing frameworks, CI/
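Several of these listings ask for the core Python data stack and end-to-end ML pipelines; the sketch below shows that idea at its smallest, bundling preprocessing and a model into one scikit-learn Pipeline. The synthetic dataset and the LogisticRegression estimator are illustrative assumptions only.

```python
# Minimal sketch; the synthetic data and choice of estimator are assumptions
# made only to keep the example self-contained and runnable.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic data stands in for a real feature table.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bundling the scaler with the estimator keeps training-time and serving-time
# transformations identical.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")
```

Wrapping the transforms and the estimator together is what keeps training and serving consistent when such a pipeline is later scheduled with Airflow, Prefect, or Kubeflow.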
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Noir
Data Engineer - FinTech - Newcastle (Tech stack: Data Engineer, SQL, Python, AWS, Git, Airflow, Data Pipelines, Data Platforms, Programmer, Developer, Architect, Data Engineer) Our client is a trailblazer in the FinTech space, known for delivering innovative technology solutions to global financial markets. They are expanding their engineering capability in Newcastle and are looking for a talented Data Engineer to join
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Interquest
hands-on with AWS, Azure, or GCP, working in a collaborative environment that values innovation, quality, and teamwork. What you’ll bring: Solid experience with Python, SQL, Spark, and Airflow; confidence working across AWS, Azure, or GCP; proven experience in a consultancy environment, able to manage multiple clients and projects; great communication skills, able to work with both
Knutsford, Cheshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
collaborate across teams. What We're Looking For: Proven experience with BigQuery, dbt/Dataform, and Tableau. Strong SQL and modern data architecture knowledge. Familiarity with orchestration tools (Prefect, Airflow), Git, CI/CD, and Python. Excellent communication and stakeholder engagement skills. Ready to make an impact? Apply now.
skills We're excited if you have 7+ years of experience delivering multi-tier, highly scalable, distributed web applications. Experience working with distributed computing frameworks: Hive/Hadoop, Apache Spark, Kafka, Airflow. Working with programming languages Python, Java, and SQL. Building ETL (Extraction, Transformation, and Loading) solutions using PySpark. Experience in SQL/NoSQL database design
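As a rough illustration of the PySpark ETL work that listing describes, the sketch below reads a raw CSV extract, aggregates it, and writes the result as Parquet. The bucket paths, column names, and aggregation are hypothetical placeholders.

```python
# Minimal sketch; bucket paths, column names, and the aggregation are
# hypothetical placeholders, not a prescribed pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Extract: read raw CSV (placeholder path).
orders = spark.read.option("header", True).csv("s3://example-bucket/raw/orders.csv")

# Transform: cast the amount column and total revenue per customer.
revenue = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("customer_id")
    .agg(F.sum("amount").alias("total_revenue"))
)

# Load: write the curated table as Parquet (placeholder path).
revenue.write.mode("overwrite").parquet("s3://example-bucket/curated/revenue/")

spark.stop()
```

The same extract-transform-load shape runs largely unchanged on a laptop, Databricks, or EMR; mainly the SparkSession configuration differs.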
Sunderland, Tyne and Wear, England, United Kingdom
Reed
data solutions. Provide hands-on technical guidance on data design, modelling, and integration, ensuring alignment with architectural standards. Drive the adoption of tools such as Alation, Monte Carlo, and Airflow to improve data lineage, quality, and reliability. Ensure data security, privacy, and compliance are integral to all architecture and integration designs. Act as a bridge between business and technology … Glue, Azure Blob Storage, Google BigQuery). Expertise in data modelling (Dimensional, Data Vault, Enterprise). Experience designing and implementing modern data architectures. Proficiency with integration/orchestration tools (Airflow, dbt, Glue). Strong communication and stakeholder management skills. Experience with metadata, cataloguing, and data quality tools, and knowledge of data governance and GDPR. Benefits: Opportunity to work in