to influence others Skills and Abilities Platforms & Tools Cloud Computing platforms (ADLS Gen2), Microsoft Stack (Synapse, Databricks, Fabric, Profisee), Azure Service Bus, Power BI, Delta Lake, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Azure ML is a plus Languages: Python, SQL, T-SQL …
we do Passion for data and experience working within a data-driven organization Hands-on experience with architecting, implementing, and performance tuning of: Data Lake technologies (e.g. Delta Lake, Parquet, Spark, Databricks) API & Microservices Message queues, streaming technologies, and event-driven architecture NoSQL databases and query languages …
they scale their team and client base. Key Responsibilities: Architect and implement end-to-end, scalable data and AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow). Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark. Champion the …
Strong experience designing and delivering data solutions in the Databricks Data Intelligence platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on-premises or …
domains fully but should be able to show strong capability in their core areas: Cloud Data Platforms Azure Synapse Analytics, Microsoft Fabric, Azure Data Lake, Azure SQL Amazon Redshift, AWS Athena, AWS Glue Google BigQuery, Google Cloud Storage, Dataproc Artificial Intelligence & Machine Learning Azure OpenAI, Azure Machine Learning Studio … Foundry AWS SageMaker, Amazon Bedrock Google Vertex AI, TensorFlow, scikit-learn, Hugging Face Data Engineering & Big Data Azure Data Factory, Azure Databricks, Apache Spark, Delta Lake AWS Glue ETL, AWS EMR Google Dataflow, Apache Beam Business Intelligence & Analytics Power BI, Amazon QuickSight, Looker Studio Embedded analytics and interactive …
complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of …
Proficient with orchestration tools (e.g., Airflow, dbt, Prefect) 3. Comfortable working with cloud platforms (e.g., AWS) and tools like Snowflake 4. Familiar with data lake and warehouse architecture (e.g., S3 + Athena, Delta Lake) 5. Strong Python skills for data manipulation (e.g., pandas, pyarrow, pyspark) Data Infrastructure …
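The Python data-manipulation skill these listings ask for (pandas, pyarrow, pyspark) centres on reshaping tabular data. A minimal stdlib-only sketch of the core group-and-aggregate pattern behind pandas' `groupby`; the column names and data are invented for illustration:

```python
import csv
import io

# Group-and-sum over CSV rows: the core pattern behind pandas groupby().sum().
# Stdlib-only so it runs anywhere; ticker/qty columns are illustrative.
raw = "ticker,qty\nAAPL,10\nMSFT,5\nAAPL,3\n"

totals: dict[str, int] = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["ticker"]] = totals.get(row["ticker"], 0) + int(row["qty"])

print(totals)  # {'AAPL': 13, 'MSFT': 5}
```

In pandas the same operation would be a one-liner over a DataFrame; the point is the underlying key-to-aggregate mapping, which transfers directly to PySpark's `groupBy().agg()`.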
City of London, London, United Kingdom Hybrid / WFH Options
83zero Ltd
is a MUST! Key expertise and experience we're looking for: Data Engineering in Databricks - Spark programming with Scala, Python, SQL Ideally experience with Delta Lake, Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks Data Governance experience …
or CloudFormation. Experience with workflow orchestration tools (e.g., Airflow, Dagster). Good understanding of Cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with dbt, Delta Lake, Databricks Experience working in Agile environments with tools like Jira and Git. About Us We are Citation. We are far from your …
performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access …
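The Medallion Architecture named above organises a Lakehouse into bronze (raw), silver (validated), and gold (business-level) layers. A minimal plain-Python sketch of that flow, assuming invented record shapes; on Databricks each stage would be a Delta table written via Spark rather than an in-memory list:

```python
# Medallion-style flow in miniature: bronze (raw) -> silver (cleaned) -> gold
# (aggregated). Plain Python for illustration; record fields are invented.
bronze = [  # raw ingested records, possibly malformed
    {"policy_id": "P1", "premium": "100.0", "region": "UK"},
    {"policy_id": "P2", "premium": "bad",   "region": "UK"},
    {"policy_id": "P3", "premium": "250.5", "region": "US"},
]

def to_silver(rows: list[dict]) -> list[dict]:
    """Validate and type-cast raw rows, dropping records that fail parsing."""
    out = []
    for r in rows:
        try:
            out.append({**r, "premium": float(r["premium"])})
        except ValueError:
            continue  # skip/quarantine malformed rows
    return out

def to_gold(rows: list[dict]) -> dict:
    """Aggregate cleaned rows into business-level totals per region."""
    totals: dict[str, float] = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["premium"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 100.0, 'US': 250.5}
```

The key property each layer adds is the same in the real pattern: bronze preserves everything as ingested, silver enforces schema and quality, gold serves curated aggregates to BI tools.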
/ELT) Proficient with orchestration tools (e.g., Airflow, dbt, Prefect) Comfortable working with cloud platforms (e.g., AWS) and tools like Snowflake Familiar with data lake and warehouse architecture (e.g., S3 + Athena, Delta Lake) Strong Python skills for data manipulation (e.g., pandas, pyarrow, pyspark) Data Infrastructure & Management …
ll not only maintain and optimize our data infrastructure but also spearhead its evolution. Built predominantly on Databricks, and utilizing technologies like PySpark and Delta Lake, our infrastructure is designed for scalability, robustness, and efficiency. You'll take charge of developing sophisticated data integrations with various advertising platforms … be doing for us Leadership in Design and Development: Lead in the architecture, development, and upkeep of our Databricks-based infrastructure, harnessing PySpark and Delta Lake. CI/CD Pipeline Mastery: Create and manage CI/CD pipelines, ensuring automated deployments and system health monitoring. Advanced Data Integration: Develop …
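Delta Lake pipelines like those described typically revolve around idempotent upserts (MERGE): re-running an integration must update existing rows and insert new ones without duplicates. The semantics can be sketched with plain dicts, assuming an invented `id`/`spend` schema; on Databricks this would be a `MERGE INTO` over a Delta table rather than this toy function:

```python
# Upsert (MERGE) semantics at the heart of Delta Lake pipelines, sketched
# with dicts keyed by primary key. Illustrative only, not the Databricks API.
def upsert(target: dict, updates: list[dict], key: str) -> dict:
    """Return target with each update row merged in by its key column."""
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row  # update if key exists, insert otherwise
    return merged

target = {"c1": {"id": "c1", "spend": 100}, "c2": {"id": "c2", "spend": 40}}
updates = [{"id": "c2", "spend": 55}, {"id": "c3", "spend": 10}]
result = upsert(target, updates, "id")
print(sorted(result))  # ['c1', 'c2', 'c3']
```

Because the operation is keyed, applying the same `updates` batch twice yields the same result, which is what makes automated CI/CD redeploys of such pipelines safe.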
The candidate should be able to create a design and implement it, should understand using a Databricks and/or Synapse Delta Lake based Staging Layer and Business Layer for BI purposes, and should understand data modelling, structural design, and Spark development. The candidate …
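The staging-then-business-layer split described above lands source data as-is, then exposes a conformed layer for BI. A minimal sketch with `sqlite3` standing in for the Synapse/Databricks Delta tables; all table and column names are invented for illustration:

```python
import sqlite3

# Staging layer: land source rows unchanged. Business layer: a conformed,
# aggregated view for BI. sqlite3 stands in for Delta tables; names invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_sales (order_id TEXT, amount REAL, country TEXT);
    INSERT INTO stg_sales VALUES
        ('O1', 10.0, 'uk'), ('O2', 20.0, 'UK'), ('O3', 5.0, 'US');

    -- Business layer: standardise values and aggregate for reporting.
    CREATE VIEW biz_sales_by_country AS
    SELECT UPPER(country) AS country, SUM(amount) AS total
    FROM stg_sales
    GROUP BY UPPER(country);
""")
rows = conn.execute(
    "SELECT country, total FROM biz_sales_by_country ORDER BY country"
).fetchall()
print(rows)  # [('UK', 30.0), ('US', 5.0)]
```

Keeping the staging layer raw and pushing all conforming logic (here, the `UPPER(country)` normalisation) into the business layer is what lets the pipeline be rebuilt from source without data loss.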