data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching. Good knowledge of Databricks, Snowflake, Azure/AWS/Oracle cloud, R, Python. Additional Information: At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being
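As an illustrative aside, the standardization and matching routine this listing describes might look like the minimal pandas sketch below; the column names and rules are hypothetical and not taken from any employer's pipeline.

```python
# Minimal data-cleansing sketch: standardize text fields, then link records
# that match on the normalized key. Column names are hypothetical.
import pandas as pd

def standardize(series: pd.Series) -> pd.Series:
    # Lowercase, trim whitespace, and collapse internal spaces.
    return (series.astype(str)
                  .str.lower()
                  .str.strip()
                  .str.replace(r"\s+", " ", regex=True))

customers = pd.DataFrame({
    "name": ["Acme  Ltd", "acme ltd ", "Globex PLC"],
    "city": ["Leeds", "LEEDS", "London"],
})

customers["name_std"] = standardize(customers["name"])
customers["city_std"] = standardize(customers["city"])

# Simple exact-match linking on the standardized key; real routines would
# typically layer fuzzy matching and survivorship rules on top of this.
matched = customers.groupby(["name_std", "city_std"], as_index=False).first()
print(matched)
```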
languages such as Python or R, with extensive experience with LLMs, ML algorithms, and models. Experience with cloud services like Azure ML Studio, Azure Functions, Azure Pipelines, MLflow, Azure Databricks, etc., is a plus. Experience working in Azure/Microsoft environments is considered a real plus. Proven understanding of data science methods for analyzing and making sense of research data
be learning how to improve efficiencies and the quality of data provided, creating dashboards and handling errors through alerts. You will be working with tools and platforms including Azure, GCP, Databricks and BigQuery. You will learn about building and working with CI/CD pipelines using Azure DevOps. You will gain a key understanding of best practices within data: Code Review, Documentation
SKILLS) Bachelor's or Master's degree in Computer Science, Engineering, or relevant hands-on experience with data engineering. Strong hands-on knowledge of data platforms and tools, including Databricks, Spark, and SQL. Experience designing and implementing data pipelines and ETL processes. Good knowledge of MLOps principles and best practices to deploy, monitor and maintain machine learning models in
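The MLOps requirement above (deploying, monitoring and maintaining models) is often addressed with an experiment-tracking tool such as MLflow, which another listing in this section names; the following is only a minimal, hypothetical tracking sketch, not any employer's actual setup.

```python
# Minimal MLflow tracking sketch: log parameters, metrics, and a model so it
# can later be deployed and monitored. Run and metric names are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="baseline-rf"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, artifact_path="model")
```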
Global Data & AI area. We leverage scalable data platforms, MLOps, reference architectures, and modular components across the entire data and ML lifecycle. We are cloud-native, primarily working with Databricks, AWS and MS Azure. We embrace agility, diverse thinking, and continuous learning, fostering a culture of feedback, honesty, and fun. Please note that if you are NOT a passport holder
manipulation and analytical skills in languages such as Python and SQL. Knowledge of modern visualization tools such as PowerBI is a plus. Familiarity with cloud-based platforms such as Databricks, Snowflake and Azure is an advantage, but not essential. Effective task/project management and general organization skills. Excellent verbal and written communication skills; ability to convey complex concepts to
be capable of providing strong technical leadership to bring the roadmap to life. This should be in addition to a strong core foundation in data technology, with exposure to Databricks highly valued. The role is responsible for guiding the development and strategy for technology, recently onboarded into the business, as well as the data assets that are created by or
Leeds, England, United Kingdom Hybrid / WFH Options
KPMG United Kingdom
learning and Data Science applications. Ability to use a wide variety of open-source technologies. Knowledge and experience using at least one Data Platform Technology such as Quantexa, Palantir or Databricks. Knowledge of test automation frameworks and ability to automate testing within the pipeline. To discuss this or wider Technology roles with our recruitment team, all you need to do is
and high-performance applications in production environments. Prepare to be tenacious and collaborate to drive the right solutions forward. Have a good understanding of Microsoft Azure tooling including Azure Databricks, preferably with hands-on experience. Strong understanding of data modelling and architectures (e.g. data vaults, data warehousing, data lakes etc.) Experience with Azure based solutions such as Azure Data Factory.
Swansea, Wales, United Kingdom Hybrid / WFH Options
Driver and Vehicle Licensing Agency (DVLA)
strong technical understanding and interest in engineering reliable and maintainable data products. You will be experienced in a range of technologies including: Python & SQL languages; Distributed data processing (e.g. Databricks, Apache Spark, AWS Glue); Data integration techniques (e.g. API, ELT/ETL); Decentralised version control systems (e.g. git, mercurial); Testing. You will also possess programming and build skills with the
concepts, and ETL processes. Working proficiency in at least one scripting language (e.g., Python, JavaScript) is highly preferred. Hands-on experience with modern cloud data platforms such as Snowflake, Databricks, Amazon Redshift, Google BigQuery or similar. Familiarity with Business Intelligence and data visualization tools (e.g., ThoughtSpot, Tableau, Power BI, Looker, MicroStrategy). Exceptional ability to quickly grasp customer needs and
e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization. Familiarity with graph databases (e.g., Neo4j, Memgraph) or search platforms (e.g., Elasticsearch, OpenSearch) to support complex data relationships and querying
programming language; Terraform on Spacelift for our infrastructure definition and deployment; Kubernetes for data services and task orchestration; Airflow for job scheduling and tracking; CircleCI for continuous deployment; Databricks for our data lake platform; Parquet and Delta file formats on S3 for data lake storage; Postgres/Aurora for our relational databases; Spark for data processing; dbt for data
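For illustration only, reading one of the Delta tables that a stack like the one above keeps on S3 might look like the PySpark sketch below; the bucket, paths and column names are made up, and the Delta and S3 libraries are assumed to be available on the cluster (as they are on Databricks).

```python
# Illustrative sketch: read a Delta table from S3 with Spark and write an
# aggregated Parquet extract. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-lake-example").getOrCreate()

orders = spark.read.format("delta").load("s3://example-data-lake/silver/orders")

daily_revenue = (orders
                 .groupBy(F.to_date("order_ts").alias("order_date"))
                 .agg(F.sum("amount").alias("revenue")))

(daily_revenue.write
              .mode("overwrite")
              .parquet("s3://example-data-lake/gold/daily_revenue"))
```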
Implement robust data management & governance strategies for long-term sustainability. Expertise in cloud data platforms (AWS, Azure, GCP) and leading modern data warehouses (BigQuery, Redshift, Synapse, Snowflake, Databricks). Experience with big data processing frameworks (Apache Spark, Flink). Advise public service clients on data platform modernization strategies. Work closely with business, engineering, and policy teams to align
London, England, United Kingdom Hybrid / WFH Options
NTT DATA
data streaming and event-driven architectures, with a focus on Kafka and Confluent platforms. In-depth knowledge in architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog. Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns. Experience in developing data product strategies, with a strong inclination towards a product-led approach
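As a hedged sketch of how the streaming and lakehouse pieces named above can meet, the following PySpark Structured Streaming job reads from a Kafka topic and appends to a Delta table; the broker address, topic name and paths are hypothetical.

```python
# Illustrative sketch: stream events from Kafka into a Delta lakehouse table
# with Spark Structured Streaming. Brokers, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

events = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker1:9092")
               .option("subscribe", "events")
               .load()
               .select(col("key").cast("string"),
                       col("value").cast("string"),
                       col("timestamp")))

query = (events.writeStream
               .format("delta")
               .outputMode("append")
               .option("checkpointLocation", "/mnt/checkpoints/events")
               .start("/mnt/lakehouse/bronze/events"))

query.awaitTermination()
```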
involvement to resolve and close tickets in a timely manner. Skilled in troubleshooting, preferably with experience of working as a support consultant in an ITIL environment. Advantageous Qualifications and Skills. Additional Skills: Databricks, Python, .NET Framework (C# or VB.NET); IBM Cognos BI or IBM Planning Analytics; Azure Fundamentals (AZ-900), Data Fundamentals (DP-900), Azure Data Engineer Associate (DP-203), Power BI Data
Middlesbrough, Yorkshire, United Kingdom Hybrid / WFH Options
Causeway Technologies
best practice. Desirable: Exposure to data modelling, data quality, data warehousing, data governance and security best practices. Exposure to Python for data engineering and data platforms like MS Fabric, Databricks, etc. Public cloud provider experience (AWS, Azure, GCP). What you get from us: If you're looking to build an exceptional career with an award-winning company you've come
London, England, United Kingdom Hybrid / WFH Options
Trimble
RAG frameworks: Use techniques such as chunking, hybrid search, query translation, similarity search, vector DBs, evaluation metrics, and ANN algorithms. Monitoring performance: Using observability services such as Datadog and Databricks for LLM observability and analytics. Keep track of latest research: Given that this is a fast-evolving field, it’s important to keep track of the latest advancements in fine
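To make the retrieval side of that RAG description concrete, here is a minimal, self-contained sketch of chunking and similarity search; it uses TF-IDF vectors purely so the example runs anywhere, whereas a real system would use a neural embedding model and a vector database, and all document text here is invented.

```python
# Minimal retrieval sketch for a RAG pipeline: split documents into chunks,
# vectorize them (TF-IDF as a stand-in for an embedding model), and return
# the chunks most similar to a query by cosine similarity.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    # Fixed-size character chunking with overlap; real systems often chunk
    # by tokens or sentences instead.
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

documents = [
    "Databricks provides a managed Spark environment with Delta Lake support.",
    "Vector databases store embeddings and answer nearest-neighbour queries.",
]
chunks = [c for doc in documents for c in chunk(doc)]

vectorizer = TfidfVectorizer().fit(chunks)
chunk_vectors = vectorizer.transform(chunks)

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = cosine_similarity(vectorizer.transform([query]), chunk_vectors)[0]
    top = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in top]

print(retrieve("Which database answers nearest-neighbour queries?"))
```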
London, England, United Kingdom Hybrid / WFH Options
nCino, Inc
role. Experience with Agile/Scrum Framework. Excellent problem-solving and analytical skills. Excellent communication skills, both at a deep technical level and stakeholder level. Data Expert experience with Databricks (PySpark). Experience building and maintaining complex ETL Projects, end-to-end (ingestion, processing, storage). Expert knowledge and experience with data modelling, data access, and data storage techniques. Experience
uncover opportunities, and recommend actions that improve business performance and cost efficiency. Develop and Maintain Strategic Data Models: Build scalable, well-documented data models using tools like dbt and Databricks to support transparency and accuracy in tracking core KPIs. Ensure models are robust, maintainable, and aligned with evolving business needs. Deep Dive into Company Performance: Collaborate closely with cross-functional
London, England, United Kingdom Hybrid / WFH Options
FSP
years. More information on SC Clearance checks can be found here. Responsibilities: Designing cloud-based data platforms and data solutions using a wide range of Azure services including Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Synapse, Azure Cosmos DB, Azure Data Factory and Azure Blob Storage. Building conceptual, logical and physical data models optimised for analytical use
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create data pipelines on a cloud environment and integrate
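As a small, hypothetical example of the JSON/CSV handling with Spark mentioned above, the snippet below reads both formats into a shared schema and combines them; the file paths and column names are invented for illustration.

```python
# Illustrative sketch: load CSV and JSON sources with Spark, align them to a
# shared schema, and combine them. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("json-csv-transform").getOrCreate()

csv_events = (spark.read
                   .option("header", True)
                   .csv("/data/raw/events_csv/")
                   .select("event_id", "event_type",
                           F.to_timestamp("created_at").alias("created_at")))

json_events = (spark.read
                    .json("/data/raw/events_json/")
                    .select("event_id", "event_type",
                            F.to_timestamp("created_at").alias("created_at")))

all_events = csv_events.unionByName(json_events)
all_events.write.mode("overwrite").parquet("/data/curated/events/")
```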