City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Lambda, EMR) Strong communication skills and a collaborative mindset Comfortable working in Agile environments and engaging with stakeholders Bonus Skills Experience with Apache Iceberg or similar table formats (e.g., Delta Lake, Hudi) Exposure to CI/CD tools like GitHub Actions, GitLab CI, or Jenkins Familiarity with data quality frameworks such as Great Expectations or Deequ Interest in …
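The table formats and data-quality frameworks named above pair naturally in practice. A minimal sketch, assuming a PySpark environment with the delta-spark extensions configured (Databricks or similar); the schema, table and column names are hypothetical, and the inline check simply stands in for what Great Expectations or Deequ would formalise with suites and reporting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

# Hypothetical orders data with a deliberately missing amount.
orders = spark.createDataFrame(
    [(1, "GBP", 120.50), (2, "GBP", None), (3, "EUR", 89.99)],
    ["order_id", "currency", "amount"],
)

# Persist as a Delta table (Iceberg or Hudi would use a different format string).
orders.write.format("delta").mode("overwrite").saveAsTable("staging.orders")

# A simple expectation-style check; data-quality frameworks formalise checks like
# this with named expectations, profiling and result reporting.
null_amounts = orders.filter(F.col("amount").isNull()).count()
if null_amounts > 0:
    print(f"Data quality warning: {null_amounts} rows with null amount")
```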
London (City of London), South East England, United Kingdom
Miryco Consultants Ltd
machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions. Key Skills & Experience Essential: Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake). Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik). Strong SQL and data modelling skills. Experience working with large, complex financial …
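For context on the Databricks (SQL, PySpark, Delta Lake) work this listing describes, a minimal sketch of a typical pattern: reading a curated Delta table and publishing an aggregated table for a BI tool such as Power BI or Tableau to query. This assumes a Databricks-style environment; the table and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Curated ("silver") Delta table of trades, registered in the metastore.
trades = spark.read.table("silver.trades")

# Aggregate to a reporting grain suitable for dashboards.
daily_pnl = (
    trades
    .groupBy("trade_date", "desk")
    .agg(F.sum("pnl").alias("total_pnl"), F.count("*").alias("trade_count"))
)

# Publish a "gold" table that BI tools connect to over SQL.
daily_pnl.write.format("delta").mode("overwrite").saveAsTable("gold.daily_pnl")
```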
across SQL Server, PostgreSQL, and cloud databases Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data) Proficiency with Parquet/Delta Lake or other modern data storage formats Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing Knowledge of data architectures supporting AI …
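A minimal sketch of the kind of streaming ingestion described above, assuming Spark Structured Streaming with the Kafka connector and Delta/Parquet as the landing format; the broker address, topic, message schema and paths are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Expected shape of the JSON messages on the topic (hypothetical).
schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

# Read the raw Kafka stream and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical brokers
    .option("subscribe", "payments")                    # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the parsed events in Delta (swap the format for "parquet" if preferred).
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/payments")
    .start("/mnt/lake/payments")
)
query.awaitTermination()
```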
models and reports. Experience required: Strong background in data engineering, warehousing, and data quality. Proficiency in Microsoft 365, Power BI, and other BI tools. Familiarity with Azure Databricks and Delta Lake is desirable. Ability to work autonomously in a dynamic environment and contribute to team performance. Strong communication, influencing skills, and a positive, can-do attitude. Knowledge of …
the technical lead and design authority Ability to partner with and influence senior client stakeholders to drive the programme to the required outcomes Hands-on experience of Databricks including Delta Lake and Unity Catalog Experience of cloud architectures. We favour Azure and AWS. You have guided data engineers and analysts through optimising their workloads and take FinOps at …
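A minimal sketch of the Databricks Delta Lake and Unity Catalog usage referenced above: querying a Delta table through Unity Catalog's three-level namespace and granting read access with SQL. The catalog, schema, table and group names are hypothetical, and the GRANT assumes a Unity Catalog-enabled workspace.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Unity Catalog addresses tables as catalog.schema.table.
df = spark.sql("SELECT * FROM finance.reporting.positions LIMIT 10")
df.show()

# Governance is expressed as SQL GRANTs on securable objects.
spark.sql("GRANT SELECT ON TABLE finance.reporting.positions TO `data-analysts`")
```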
environment. What you'll be doing as the Machine Learning Operations Engineer: Developing and maintaining API services using Azure and Databricks Managing caching layers with Azure Cache (Redis) Using Delta Live Tables for data processing and analytics Integrating with cloud-based data storage solutions like Snowflake Collaborating with cross-functional teams in an agile environment Supporting analytics, model deployment … What we're looking for from the Machine Learning Operations Engineer: Solid experience in ML Ops, particularly with Azure and Databricks Familiarity with Postgres, Redis, and Snowflake Understanding of Delta Lake architecture, Docker, and container services Experience building and orchestrating APIs Strong problem-solving and communication skills Bonus: exposure to Azure Functions, Containers, or Insights Benefits for the …
Employment Type: Permanent
Salary: £50000 - £70000/annum bonus, 25 days holiday + more
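A minimal sketch of a Delta Live Tables pipeline of the kind referenced in the listing above. The `dlt` module and the global `spark` session are only available inside a Databricks DLT pipeline; the source path, column names and expectation rule are hypothetical.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events ingested from cloud storage")
def raw_events():
    # `spark` is provided by the DLT runtime; the landing path is hypothetical.
    return spark.read.format("json").load("/mnt/landing/events/")

@dlt.table(comment="Cleaned events for analytics and model features")
@dlt.expect_or_drop("valid_amount", "amount IS NOT NULL")
def clean_events():
    # Rows failing the expectation above are dropped by the pipeline.
    return dlt.read("raw_events").withColumn("ingested_at", F.current_timestamp())
```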
Herefordshire, West Midlands, United Kingdom Hybrid / WFH Options
IO Associates
and maintain platform software, libraries, and dependencies. Set up and manage Spark clusters, including migrations to new platforms. Manage user accounts and permissions across identity platforms. Maintain the Delta Lake and ensure platform-wide security standards. Collaborate with the wider team to advise on system design and delivery. What we're looking for: Strong Linux engineering …
Kubernetes stack with secure-by-design tools Update security, software, dependencies and libraries Set up and migrate Spark clusters across platforms Manage user accounts and IdP permissions Maintain the Delta Lake Ensure secure-by-design assurance throughout the platform Experience Required: Strong Linux engineering background Expertise in Kubernetes and Docker Proficient in scripting (Python, Bash) Experience with air …
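A minimal sketch of the routine Delta Lake maintenance these platform roles mention, assuming it runs as a scheduled Spark job on Databricks (or open-source Delta Lake with SQL support enabled); the table names and retention window are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical Delta tables maintained by the platform team.
for table in ["lake.bronze_events", "lake.silver_events"]:
    # Compact small files so downstream reads stay fast.
    spark.sql(f"OPTIMIZE {table}")
    # Remove files no longer referenced by the table; 168 hours matches the
    # default 7-day retention, so this will not fail the safety check.
    spark.sql(f"VACUUM {table} RETAIN 168 HOURS")
```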