London, England, United Kingdom Hybrid / WFH Options
Aventum Group
designs and techniques Knowledge of Data Warehouse project lifecycle, tools, technologies, and best practices Experience using Cloud Computing platforms (ADLS Gen2), Microsoft Stack (Synapse, Databricks, Fabric, Profisee), Snowflake Data Integration, Azure Service Bus, Delta Lake, BigQuery, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service more »
create effective data models supporting business processes, including knowledge of dimensional modeling and normalization. Familiarity with big data technologies on Azure, such as Azure Databricks and Azure Stream Analytics. Knowledge of data security practices, cloud architecture principles, scripting languages for automation, data visualization tools, DevOps practices, machine learning frameworks, performance more »
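The dimensional-modelling requirement above can be illustrated with a minimal star schema: a fact table keyed to a dimension table, queried by joining and aggregating. This is only a sketch; the table and column names are hypothetical, and sqlite3 stands in for a warehouse engine such as Synapse or Databricks SQL.

```python
import sqlite3

# Hypothetical star schema: one fact table referencing one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(10, 1, 3, 30.0), (11, 2, 1, 15.0), (12, 1, 2, 20.0)])

# A typical dimensional query: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT d.product_name, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.product_name
    ORDER BY d.product_name
""").fetchall()
print(rows)  # [('Gadget', 15.0), ('Widget', 50.0)]
```

The same shape carries over to normalised source models: the dimension holds descriptive attributes, the fact holds measures, and the surrogate key links them.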
to maintain delivery momentum We’d Love If You Also Have These: Highly proficient in SQL Experience using Python-based ETL tools such as Databricks Experience using Data Ingestion tools such as Fivetran or Stitch Experience using Business Intelligence tools such as Power BI, Looker or Tableau Experience with data pipeline more »
Governance, Data Security processes as part of the solution design About the Candidate Essential experience Experience with the following technologies Azure Synapse, Data Factory, Databricks, SQL DB, Data Lake, Key Vault Azure DevOps and CI/CD pipelines Coding in SQL and PySpark/Python DW/Data Vault concepts more »
platforms * Google Data Products tools knowledge (e.g., BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) Relevant certifications * Python * Snowflake * Databricks To apply please click the "Apply" button and follow the instructions. For a further discussion, please contact Sam Stark - (phone number removed) 83DATA is a more »
London, England, United Kingdom Hybrid / WFH Options
Version 1
Warehouse, Azure Data Lake, Azure Cosmos DB, Azure Stream Analytics Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks). Experience building data warehouse solutions using ETL/ELT tools such as SQL Server Integration Services (SSIS), Oracle Data Integrator (ODI), Talend, and WhereScape more »
efficient scrum teams Be a subject matter expert in the data platform domain, demonstrating proficiency in: Data lakes, data marts, and data warehousing (Azure Databricks, ADF, Azure Synapse, HDInsight). SQL, NoSQL, and Graph databases. Processing both structured and unstructured data, data modelling, and achieving a single customer view. more »
data requirements and deliver solutions that drive business value. Requirements: 7+ years in a Data Engineering Role Excellent proficiency in SQL, Python, Microsoft Azure, Databricks, PySpark Experience managing a team Details: Start Date: ASAP Duration: 3 months, option for permanent extension Day rate: Up to £400 (Ltd), depending on experience Annual more »
SQL and Azure Data Engineering Hands-on experience in designing and developing scripts for custom ETL processes and automation in Azure Data Factory, Azure Databricks, Azure Synapse, Python, PySpark, etc. Experience being customer-facing on numerous data focused projects with a consultative approach Ability to deliver high to low-level more »
One or more of the following key technology skills: Systems integration, APIs (REST, SOAP, etc.) Informatica SQL Server Integration Services (SSIS) Azure Data Factory Databricks/Apache Spark Amazon Redshift Azure Synapse SQL Server Oracle Database Oracle Data Integrator Oracle Integration Cloud Business Objects Data Services (BODS) Equivalent tech (useful more »
Business Intelligence products and solutions Strong ability to coach, guide and motivate junior team members Deep understanding of the Microsoft technology stack, specifically Power BI, Databricks & Azure Cloud Proficiency in SQL, strong understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand more »
timely delivery. Requirements: Significant experience working as a Lead Data Engineer within Financial Services Certifications in relevant technologies, such as Azure Data Engineer or Databricks Certified Developer Experience with real-time data processing and streaming technologies like Apache Kafka or Azure Event Hubs Knowledge of data visualization tools, such as more »
with internal senior stakeholders and board members to help design business strategy. Skills & Qualifications Strong knowledge of Azure Cloud Data platform, experience working with Databricks, Azure Data Factory and SQL Server. Experience of Python and Terraform would be beneficial. Strong stakeholder management and great communication skills to build upon and more »
An industry-leading organisation is looking for an experienced Senior Data Engineer who is well-versed in Databricks, PySpark and SQL to join their growing Data Engineering team. This role can be largely remote, with some travel to their Central London Head Office, likely around 2 times per month. This … Data Engineer to mentor and lead the Data Engineering team, undertaking a wide variety of technical work. Your primary focus will be working with Databricks - you will be building data pipelines in Databricks, coding in PySpark, and supporting internal applications as they move away from a legacy application into … encourage you to provide thought leadership, and to innovate and experiment with new technologies to deliver benefits to the business. Requirements Excellent skills in Databricks and PySpark Excellent experience with SQL and T-SQL programming Experience designing, developing and maintaining data warehouses and data lakes Knowledge of Azure technologies including more »
London, England, United Kingdom Hybrid / WFH Options
Ripple
and crystallizing them into scalable data solutions Excited about operating independently, demonstrating excellence, and learning new technologies and frameworks NICE TO HAVE: Experience in Databricks is a plus Crypto domain experience or interest WHO WE ARE: Do Your Best Work The opportunity to build in a fast-paced start-up more »
Data Mart. Utilize Vector Databases, Cosmos DB, Redis, and Elasticsearch for efficient data storage and retrieval. Demonstrate proficiency in programming languages including Python, Spark, Databricks, PySpark, SQL, and ML Algorithms. Implement Machine Learning models and algorithms using PySpark, scikit-learn, and other relevant tools. Manage Azure DevOps, CI/CD … pipelines, GitHub, and Kubernetes (AKS) for efficient software development and deployment. Implement MLOps solutions using Azure ML, Databricks MLflow, and LLMOps for streamlined Machine Learning operations. Ensure data governance and data cataloging standards are maintained, utilizing tools such as Collibra and Talend Data Catalog. Collaborate with stakeholders to … Cloud environments, Azure Data Lake, Azure Data Factory, Microservices architecture. Experience with Vector Databases, Cosmos DB, Redis, Elasticsearch. Strong programming skills in Python, Spark, Databricks, PySpark, SQL, ML Algorithms, Gen AI. Knowledge of Azure DevOps, CI/CD pipelines, GitHub, Kubernetes (AKS). Experience with MLOps tools such more »
Lake, hyperscalers, especially Microsoft Azure technologies across the Data Lifecycle. Have experience in architecting and solutioning Azure-based projects. Hands-on knowledge of ADF, Databricks, Power BI Experience in various Azure services across Data Ingestion, Data Modelling, Data Governance, Data Quality, Reporting and Visualization and Analytics capabilities. Experience in setting up more »
ETL, SQL Database, Reporting) Knowledge of Dimensional Modelling Strong SQL skills, experience in writing complex SQL queries Knowledge of Azure Data Lake, Data Factory, Databricks and Functions Essential Experience Experience in supporting traditional Data Warehouses Exposure in building Azure based data ingestion and transformation pipelines, data lake, data warehouse/ more »
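The "complex SQL queries" asked for in listings like the one above typically mean analytic constructs such as window functions. A minimal sketch, using hypothetical table and column names and sqlite3 in place of a warehouse engine, showing a running total over a window:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_sales (day TEXT, amount REAL)")
conn.executemany("INSERT INTO daily_sales VALUES (?, ?)",
                 [("2024-01-01", 100.0), ("2024-01-02", 50.0), ("2024-01-03", 25.0)])

# Window function: cumulative revenue ordered by day, without collapsing rows
# the way a GROUP BY would.
rows = conn.execute("""
    SELECT day,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM daily_sales
    ORDER BY day
""").fetchall()
print(rows)
# [('2024-01-01', 100.0), ('2024-01-02', 150.0), ('2024-01-03', 175.0)]
```

The same `SUM(...) OVER (...)` pattern works in Azure SQL, Synapse and Databricks SQL, usually with a `PARTITION BY` clause added to compute the total per customer or product.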
What are we looking for? Essentials Experience building high quality Business Intelligence products and solutions Good understanding of the Microsoft technology stack, specifically Power BI, Databricks & Azure Cloud Proficiency in SQL, good understanding of ETL and ELT processes Strong communication skills, ability to translate complex messages into concise, easy-to-understand more »
An excellent Data Product company is looking for Data Engineers to join their team in heavily Databricks-focused roles. They specifically like people with Databricks Certifications, but other Cloud Data stack certs (AWS, Azure, GCP, etc.) are also of interest to them. These roles would support talented and driven Juniors … based in the UK). Role Overview: In this vital role, you will develop and maintain enterprise-grade software systems leveraging your expertise in Databricks, Python, Spark, R, and SQL. You will collaborate closely with our architecture team to design scalable, clean solutions that support continuous delivery and improvement. Your … automating tests, enhancing our continuous integration and delivery practices, and maintaining high standards in data processing and analytics. Responsibilities Include: Developing and maintaining robust Databricks notebooks and workflows. Working with Databricks Unity Catalog for Governance-related tasks Driving the delivery of data to support new product features in line with more »