Proven ability to develop data pipelines (ETL/ELT). Strong inclination to learn and adapt to new technologies and languages. Strong understanding of and experience in working with Databricks Delta Lake. Proficiency in Microsoft Azure cloud technologies. What will be your key responsibilities? Collaborate in hands-on development … to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJS. Support the …
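As a rough illustration of the ETL/ELT pipeline work these roles describe, the extract → transform → load stages can be sketched in plain Python (standing in for PySpark/Databricks; every name here is hypothetical, not from any listed employer's codebase):

```python
# Minimal ETL sketch: extract raw records, transform them, load into a target.
# Plain Python stands in for PySpark/Delta Lake; all names are illustrative.

def extract(source_rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    for row in source_rows:
        yield row

def transform(rows):
    """Transform: normalise field names and derive a total column."""
    for row in rows:
        yield {
            "customer_id": row["id"],
            "total": row["quantity"] * row["unit_price"],
        }

def load(rows, target):
    """Load: append transformed records to the target store."""
    target.extend(rows)
    return target

raw = [
    {"id": 1, "quantity": 2, "unit_price": 5.0},
    {"id": 2, "quantity": 1, "unit_price": 3.5},
]
warehouse = load(transform(extract(raw)), [])
print(warehouse)
```

In a real Databricks pipeline each stage would typically be a DataFrame operation writing to a Delta table rather than Python generators, but the staged shape is the same.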
not expected to cover all domains fully but should be able to show strong capability in their core areas:
Cloud Data Platforms: Azure Synapse Analytics, Microsoft Fabric, Azure Data Lake, Azure SQL; Amazon Redshift, AWS Athena, AWS Glue; Google BigQuery, Google Cloud Storage, Dataproc
Artificial Intelligence & Machine Learning: Azure OpenAI, Azure Machine Learning Studio, Azure AI Foundry; AWS SageMaker … Amazon Bedrock; Google Vertex AI, TensorFlow, scikit-learn, Hugging Face
Data Engineering & Big Data: Azure Data Factory, Azure Databricks, Apache Spark, Delta Lake; AWS Glue ETL, AWS EMR; Google Dataflow, Apache Beam
Business Intelligence & Analytics: Power BI, Amazon QuickSight, Looker Studio; embedded analytics and interactive dashboarding solutions
Cloud Architecture: Azure App Services, Virtual Machines, Functions, Kubernetes; AWS EC2 …
with the ability to explain complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of regulatory data requirements such as …
data scientists, IT teams, and business leaders across Europe. What We Need From You: 5+ years in Azure Data Engineering. Deep knowledge of Azure Data Factory, Databricks, Python, SQL, and Delta Lake. Skilled in data modeling, medallion architecture, and integration via APIs, Event Hubs, ODBC, and SFTP. Azure certification? Big plus. Power BI and ML experience? Even better. Fluent …
At least 10 years' experience in Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self … stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4. Data Lake & Storage: Databricks Delta Lake, Amazon S3. Data Transformation: dbt. Cloud Data Warehouse: Snowflake. Analytics & Reporting: Power BI, Excel, Snowflake SQL, REST API. Advanced Analytics: Databricks (AI & Machine …
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
with IaC tools like Terraform or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud. Familiarity with dbt, Delta Lake, Databricks. Experience working in Agile environments with tools like Jira and Git. About Us: We are Citation. We are far from your average service provider. Our colleagues …
data platforms (Azure preferred). Partnering closely with data analysts, quants, and tech teams. Tech you’ll be using: Python (strong hands-on coding essential), Databricks/Spark, Azure Data Lake/Delta Lake, SQL/ETL frameworks, CI/CD tools and version control (Git, Azure DevOps). What we’re looking for: Strong commercial experience as a …
spoke data architectures, optimising for performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access. Key Skills & Experience: 5+ …
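For context on the Medallion (bronze → silver → gold) pattern this listing names, the layering can be sketched in plain Python, with dicts and lists standing in for Delta tables; this is only an illustration of the idea, and all record names are hypothetical:

```python
# Medallion sketch: bronze keeps raw data as landed, silver cleans and types it,
# gold holds business-level aggregates. Dicts stand in for Delta Lake tables.

bronze = [  # raw, as-landed records (may contain nulls and duplicates)
    {"order_id": "A1", "amount": "10.0", "region": "EMEA"},
    {"order_id": "A1", "amount": "10.0", "region": "EMEA"},  # duplicate
    {"order_id": "B2", "amount": "4.5", "region": "EMEA"},
    {"order_id": "C3", "amount": None, "region": "APAC"},    # bad record
]

# Silver: deduplicate on the business key, drop bad records, cast types.
seen, silver = set(), []
for r in bronze:
    if r["amount"] is None or r["order_id"] in seen:
        continue
    seen.add(r["order_id"])
    silver.append({
        "order_id": r["order_id"],
        "amount": float(r["amount"]),
        "region": r["region"],
    })

# Gold: business-level aggregate (revenue per region).
gold = {}
for r in silver:
    gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]

print(gold)
```

In Databricks each layer would usually be a separate Delta table populated by Spark jobs, but the one-way refinement from raw to curated is the defining feature of the pattern.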
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments. Expert knowledge of data technologies, data transformation tools, and data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks, or similar. Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. The role comes with an extensive benefits package including a good pension … role. KEYWORDS: Lead Data Engineer, Senior Lead Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, On-Prem, Cloud, ETL, Azure Data Factory, ADF, Databricks, Azure Data Lake, Delta Lake, Data Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position.
Employment Type: Permanent
Salary: £75000 - £80000/annum Pension, Good Holiday, Healthcare
with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes. Experience with AWS cloud platforms. Experience and certification as an AWS administrator. Experience with Python or SQL. Experience with Delta Lake. Experience with Dataiku. Understanding of DevOps principles and practices. About the role: Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load …
Data Platform and Services, you'll not only maintain and optimize our data infrastructure but also spearhead its evolution. Built predominantly on Databricks, and utilizing technologies like PySpark and Delta Lake, our infrastructure is designed for scalability, robustness, and efficiency. You'll take charge of developing sophisticated data integrations with various advertising platforms, empowering our teams with data … decision-making. What you'll be doing for us: Leadership in Design and Development: Lead the architecture, development, and upkeep of our Databricks-based infrastructure, harnessing PySpark and Delta Lake. CI/CD Pipeline Mastery: Create and manage CI/CD pipelines, ensuring automated deployments and system health monitoring. Advanced Data Integration: Develop sophisticated strategies for integrating data …
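One recurring concern in ad-platform ingestion work like this is idempotency: re-running a batch must not duplicate rows. Delta Lake handles this with MERGE; the behaviour can be sketched in plain Python (a dict keyed by record id stands in for the Delta table; all names are hypothetical):

```python
# Idempotent upsert sketch, mimicking a Delta Lake MERGE for ad-platform ingests.
# The "table" is a dict keyed by record id: re-running an ingest batch
# overwrites matching rows instead of duplicating them.

def merge_upsert(table, incoming, key="id"):
    """Upsert incoming records into table by key (insert new, overwrite changed)."""
    for rec in incoming:
        table[rec[key]] = rec
    return table

table = {}
batch1 = [{"id": "ad-1", "clicks": 10}, {"id": "ad-2", "clicks": 3}]
batch2 = [{"id": "ad-2", "clicks": 7}, {"id": "ad-3", "clicks": 1}]  # ad-2 updated

merge_upsert(table, batch1)
merge_upsert(table, batch2)
merge_upsert(table, batch2)  # replaying a batch changes nothing
print(sorted(table))
```

In Spark SQL the equivalent is `MERGE INTO target USING updates ON target.id = updates.id WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT ...`, which gives the same replay-safe semantics at table scale.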
Understanding of data governance, privacy, and security practices. Strong problem-solving skills and a collaborative mindset. Fluent in English; Dutch is a plus. Nice to Have: Experience with Delta Lake and Medallion architecture. Knowledge of the financial services or pensions industry. Familiarity with DevOps practices and containerization (Docker/Kubernetes). No terminology in this advert is …
aligned with business objectives. Nice to Have: Experience supporting AI/ML model training infrastructure (e.g., GPU orchestration, model serving) for both diffusion and LLM pipelines. Familiarity with data lake architectures and tools like Delta Lake, lakeFS, or Databricks. Knowledge of security and compliance best practices (e.g., SOC 2, ISO 27001). Exposure to MLOps platforms or frameworks …