analytics, pricing, operations, finance, and product to understand their data needs and ensure the platform meets them. Manage project timelines, risks, dependencies, and budget where appropriate. Champion DevOps/DataOps practices, CI/CD for data pipelines, and infrastructure-as-code where applicable. Stay abreast of new developments in data architecture and cloud technologies and bring innovation into the team. …
Senior Data Engineer - (Azure/Databricks). Location: London - Scalpel. Full time. Posted 7 Days Ago. Requisition ID: REQ05851. This is your opportunity to join AXIS …
LMAX Group is looking for a highly skilled and passionate Data Operations Engineer/Developer to join our team and help build and scale our data processing infrastructure. In this role, you'll develop high-performance data pipelines, real-time …
cloud platforms with error handling and reusable libraries Documenting and presenting end-to-end data processing system diagrams (C4, UML, etc.) Implementing robust DevOps practices in data projects, including DataOps tools for orchestration, data integration, and analytics Enhancing resilience through vulnerability checks and testing strategies (unit, integration, data quality) Applying SOLID, DRY, and TDD principles practically Agile methodologies such as …
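The "data quality" testing strategy mentioned above can be illustrated with small, reusable checks that run as unit tests in a CI pipeline. This is a minimal stdlib sketch; the helper names (`check_not_null`, `check_unique`) and the sample rows are hypothetical, not from any specific library.

```python
# Hypothetical sketch of data-quality checks runnable as unit tests in CI.
# Rows are modelled as dicts; real pipelines would run similar assertions
# against DataFrames or warehouse tables.

def check_not_null(rows, column):
    """Return the rows where `column` is missing or None."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

if __name__ == "__main__":
    rows = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": None},
        {"id": 2, "amount": 5.0},
    ]
    # Unit-test-style assertions a CI job could run on every commit.
    assert check_not_null(rows, "amount") == [{"id": 2, "amount": None}]
    assert check_unique(rows, "id") == [2]
```

In practice these checks would be wired into the pipeline's test stage so a failing quality gate blocks deployment, which is the CI/CD-for-data-pipelines pattern the listings describe.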
Wales, Yorkshire, United Kingdom Hybrid / WFH Options
Made Tech Limited
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Made Tech Limited
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Made Tech Limited
Data Stack, critically ADF and Synapse (experience with Microsoft Fabric is a plus) Highly developed Python and data pipeline development knowledge, which must include substantial PySpark experience Demonstrable DevOps and DataOps experience with an understanding of best practices for engineering, test and ongoing service delivery An understanding of Infrastructure as Code concepts (demonstrable Terraform experience a plus) Demonstrable experience of Data …
Define and document technical delivery plans, ensuring clear next steps for the current phase and future expansion of projects. Governance & Standards: Advocate and enforce best practices in data modelling, DataOps, CI/CD, and security across projects. Documentation & Knowledge Transfer: Maintain high-quality technical documentation, ensuring clarity for both engineering teams and business stakeholders. Industry Awareness: Stay informed on Microsoft …
as Python Good knowledge of the Azure cloud data platform and the potential to use its services to improve analytics Good knowledge of testing BI software, release cycles, DevOps (DataOps) and how to successfully move a product from development to production Ability to understand complex technical/technology solutions and concepts, with the ability to solve complex problems. Effective IT …
and data pipelines, design & optimize analytics & relational databases, and build analytics models using DBT and bespoke aggregation engines. You'll work closely with business stakeholders, other BI Developers, DataOps, and Systems engineers to support both data and application integrations using bespoke tools written in Python/Java, as well as tools such as Meltano, Airflow, Mulesoft/…
days per week. £55,000 - £67,000 + excellent benefits. 18 Months FTC with long-term potential. Your New Company: Our large public sector organisation is seeking a senior Data Platform DataOps Engineer to serve as our client's first DataOps specialist in a growing team of Data Engineers and DevOps professionals. In this pivotal role, you will focus on operationalising and automating their … business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements. What you will need: Demonstrable experience in DataOps, Data Engineering, DevOps, or related roles focused on managing data operations …
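One concrete pattern behind "operationalising and automating" pipelines is retrying transient failures with exponential backoff before failing a run. The sketch below is illustrative only; `run_step` and `flaky_extract` are hypothetical names, not part of any Azure SDK.

```python
# Illustrative DataOps resilience pattern: retry a flaky pipeline step with
# exponential backoff instead of failing the whole run on the first error.
import time

def run_step(step, retries=3, base_delay=0.01):
    """Run `step` (a zero-argument callable), retrying on exception."""
    for attempt in range(retries):
        try:
            return step()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the error to the scheduler
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

# A hypothetical extract step that fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "rows extracted"

print(run_step(flaky_extract))  # succeeds on the third attempt
```

Managed services such as Azure Data Factory expose similar retry settings declaratively; the point of the sketch is the underlying behaviour.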
Python , SQL , and working with SQL/NoSQL databases . Skilled in designing, building, and maintaining data pipelines , data warehouses , and leveraging AWS data services . Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications , and workflow orchestration using Apache Airflow . Familiar with ETL frameworks, and bonus experience with Big Data processing …
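The core idea behind workflow orchestration with a tool like Apache Airflow is that tasks form a directed acyclic graph and execute in dependency order. Below is a toy stdlib sketch of that idea, not Airflow's actual API; the ETL task names are hypothetical.

```python
# Stdlib sketch of DAG-based orchestration: tasks run only after their
# upstream dependencies. Real orchestrators (e.g. Airflow) add scheduling,
# retries, and monitoring on top of exactly this ordering concept.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (hypothetical ETL steps)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load", "transform"},
}

def run(dag):
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        pass  # a real orchestrator would execute and monitor each task here
    return order

print(run(dag))  # extract runs first, report last
```

`graphlib.TopologicalSorter` (Python 3.9+) raises `CycleError` on circular dependencies, mirroring how orchestrators reject invalid DAG definitions.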
East London, London, United Kingdom Hybrid / WFH Options
Client Server
advanced SQL and Python coding skills You have hands-on ETL/ELT experience You have a good knowledge of best practices in CI/CD, IaC (Terraform) and DataOps You have experience of working in start-ups/scaling, high growth environments You have excellent communication and stakeholder management skills What's in it for you: Salary to £140k …
London, South East, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
Astronomer empowers data teams to bring mission-critical software, analytics, and AI to life and is the company behind Astro, the industry-leading unified DataOps platform powered by Apache Airflow. Astro accelerates building reliable data products that unlock insights, unleash AI value, and power data-driven applications. Trusted by more than 700 of the world's leading enterprises, Astronomer lets …
to motivate cross-functional squads to deliver complex technical initiatives. Software development lifecycle (SDLC) - applied understanding of SDLC best practices, having delivered improvements in previous teams' SDLC and DataOps/DevOps maturity. Agile delivery - facilitating ceremonies, removing impediments, coordinating requirements refinement to ensure tasks are achievable, and driving a culture of iterative improvement. Modern data …
teams can adopt. Ensure consistency, quality, and reusability across solutions. Serve as a point of accountability for technical decisions and architectural direction, while empowering product teams to execute effectively. DataOps Enablement and Optimization: Drive the adoption of modern DataOps principles to streamline engineering workflows. Partner with platform teams to establish CI/CD pipelines and observability standards that improve operational efficiency … such as AWS, Azure, or Google Cloud and cloud-based data services (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery). Experience with DataOps practices and tools, including CI/CD for data pipelines. Excellent leadership, communication, and interpersonal skills, with the ability to collaborate effectively with diverse teams and stakeholders. Strong …
you will be developing and deploying the next version of Contilio's cutting-edge 3D AI platform. Working in an agile manner, you will collaborate closely with the AI, DataOps and product teams to implement customer requirements and delight our global enterprise customers. We offer competitive compensation and equity ownership, as well as the opportunity to take on more ownership as we grow our team and global footprint. …
document all aspects of Netcompany focused solutions and so requires a broad technical understanding across Data & AI, Cloud, Solution Architecture, Data Engineering/Science/Governance and DevOps or DataOps, leveraging our capabilities and expertise in each of these areas to develop appropriate solutions. The role is required to be able to effectively communicate the solutions to both business stakeholders …
Data Engineer - STR - London, UK Job Description OVERVIEW CoStar Group (NASDAQ: CSGP) is a leading global provider of commercial and residential real estate information, analytics, and online marketplaces. Included in the S&P 500 Index and the NASDAQ 100, CoStar …
the office (Victoria, London)/3 days remote. Start - ASAP; need to have the ability to start by 8th August. Spec - PURPOSE OF POST:
- Load Testing Plan and Execution
- DataOps practices (CI/CD for schema)
- Query tuning & execution plan analysis
- Governance, auditing, and compliance enforcement
- Working with developers to write scalable SQL
- Proactive anomaly detection using Query Store or …
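The "proactive anomaly detection" duty above can be sketched as a simple statistical check over query timings such as those SQL Server's Query Store records: flag any duration far above the mean. The threshold, function name, and sample data below are illustrative assumptions, not part of Query Store itself.

```python
# Hedged sketch of anomaly detection over query durations (milliseconds):
# flag values whose z-score exceeds a threshold. Real monitoring would pull
# these timings from Query Store and alert instead of printing.
from statistics import mean, stdev

def flag_anomalies(durations_ms, z_threshold=2.0):
    """Return durations whose z-score exceeds the threshold."""
    mu, sigma = mean(durations_ms), stdev(durations_ms)
    if sigma == 0:
        return []  # all queries took the same time: nothing to flag
    return [d for d in durations_ms if (d - mu) / sigma > z_threshold]

timings = [12, 11, 13, 12, 14, 11, 12, 13, 250]  # one slow outlier
print(flag_anomalies(timings))  # → [250]
```

A z-score cut-off is deliberately crude; production systems would typically compare against per-query historical baselines rather than a single batch.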
and fit for purpose. Responsibilities: Assist the technical lead in driving solutions to complex problems. Build new and enhance existing services and their components. Contribute to improving the development and DataOps experience. Look for ways to help others, whether through mentoring, pair programming or another approach to knowledge sharing. Skills & Experience: Advanced skills in Python. Experience of using ORM …