ETL Development, or Database Administration.
● Prior experience working with business intelligence, analytics, or machine learning teams.
● Experience in cloud-native data solutions and real-time data processing.
● Proficiency in Databricks, Python, SQL (for ETL & data transformation).
● Knowledge of GDPR, data security best practices, and access control policies.
● Strong problem-solving and analytical skills to optimise data processes.
● Excellent collaboration …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Proven experience in a Data Engineer role with a successful track record. Proficiency in SQL and experience with database administration. Mastery of tools such as Power BI, Data Factory, Databricks, SQL Server, and Oracle. Familiarity with working as a data engineer in cloud environments such as Azure. Strong analytical, problem-solving, and attention-to-detail skills. Effective communication and teamwork skills.
Skills/Must have: Extensive hands-on experience as a Data Engineer working across both on-prem and cloud data platforms. Strong expertise in Azure Data Factory (ADF), Snowflake, Databricks, and Python. Proficiency in SQL and debugging data workflows across hybrid environments. Deep understanding of data modelling, warehousing concepts, and performance tuning. Excellent communication and collaboration skills. Salary: £500 per …
awareness, ideally in a regulated environment. Ability to work independently, ask the right questions, and challenge where needed. Desirable Skills: Experience with Azure tools such as Synapse, DevOps or Databricks. Knowledge of Python or PySpark for data transformation. Familiarity with CI/CD pipelines and Power BI deployment processes. Experience mentoring junior team members or supporting wider data teams. Background …
dashboards and reporting solutions for effective decision making. Familiarity with data product development methodologies and data monetisation strategies. Experience with one or more cloud data platforms such as Snowflake, Databricks, or cloud-native data services in Azure, AWS, or GCP. Demonstrable experience in leading or delivering large data migrations and/or transformation programmes. Key Consulting Competencies: Exceptional interpersonal skills spanning … Qualifications: Sector Experience – Significant experience in one or more of the following sectors: Banking, Wealth & Asset Management, Insurance, Healthcare & Lifesciences, Energy & Utilities, Public Sector. Data Platforms – Certifications in Snowflake, Databricks, Microsoft Fabric, or other industry-standard data platforms. Cloud Data Services – Certifications such as AWS Data Analytics Specialty, Azure Data Engineer, Google Professional Data Engineer. Additional Information: Our commitment: Wavestone …
City of London, London, United Kingdom Hybrid / WFH Options
X4 Technology
ETRM user interface using ClassEvents (C#.NET). Modifying the ETRM data model directly in the SQL Server database. Quickly acquiring new technical skills (Azure IaaS/PaaS, Crystal Reports, Python, Azure Databricks, Azure Data Factory, Power BI, etc.). Requirements of the Contract C# .NET Developer (Commodities): Proven experience within commodity trading houses. Strong ETRM domain knowledge and experience. Outstanding C#.NET development expertise.
Wythenshawe, Manchester, Lancashire, England, United Kingdom
Woods & Wood Recruitment Ltd
functional projects, with a strong background in data engineering, analytics, or architecture. Proficiency in SQL and Python is essential, along with expertise in Microsoft Azure (Data Lakes, Blob Storage), Databricks, and modern data tooling. Strong communication, leadership, and problem-solving skills are vital. Desirable: Microsoft Azure Data Engineer Associate certification and knowledge of data governance, security, and compliance. Hybrid Role …
the drive, aptitude, and passion, we'll help you get there. What's the Opportunity? We're recruiting for entry-level roles in: Junior Data Engineering - work with Azure, Databricks, Python, and Power BI; Junior DevOps Engineering - get hands-on with CI/CD, cloud platforms, automation tools, and modern engineering practices. How It Works: Apply online - we'll ask …
semantic layer across all organizational data; Modeling data at both logical and physical levels based on conceptual business needs; Building data pipelines and lakehouses on modern platforms (e.g. Azure, Databricks, Snowflake); Ensuring data quality, clear documentation, and strong governance practices; Managing access to data environments in line with internal data policies; Collaborating closely with business and IT stakeholders; Reporting to …
data models, analytics frameworks, and providing technical guidance. Strong programming skills in Python/PySpark and SQL. Experience building data lake and data warehouse solutions on cloud platforms like Databricks on Azure, with a DaaS model. Knowledge of data lake and warehouse concepts, architectural patterns, coding standards, version control, and CI/CD practices. Experience with ETL/ELT frameworks …
development and integration. Expertise in data pipeline design and flow management. Practical experience with Kafka or RabbitMQ. Familiarity with microservices architecture and cloud-based data tools (e.g., Azure Databricks). Understanding of data storage, performance optimisation, and security. Experience working in an agile environment and collaborating with cross-functional teams. Fluency in English, with French or Dutch a bonus.
or a related field, with a focus on building scalable data systems and platforms. Strong expertise with modern data tools and frameworks such as Spark, dbt, Airflow or Kafka, Databricks, and cloud-native services (AWS, GCP, or Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one …
life cycle. Understanding of blockchain technologies and data structures. Knowledge of cryptography and its application in blockchain is a plus. Experience with blockchain indexing is a plus. Experience with Databricks for data ingestion and transformation is a plus. Familiarity with Delta Lake and data warehousing concepts is a plus. Strong communication, interpersonal and presentation skills. If interested, please get in touch.
supporting automated workflows in Alteryx Designer. Experience deploying workflows to the Production Gallery. Knowledge of database fundamentals, data design, SQL, and data warehouse concepts is beneficial. Exposure to Power BI, Databricks, Azure, and Profisee is advantageous. Knowledge of JSON, Python, XML, and R is a plus. Experience with non-relational and unstructured data is beneficial. Familiarity with Azure DevOps or GitHub.
Minimum of 7 years in data and analytics or a related discipline, with proven experience in product and stakeholder management. Technical Skills: Proficiency in data analytics tools (Power BI, Databricks, Tableau, Python, R, Fabric, Excel), and a strong understanding of data architecture, governance, and quality principles. Leadership Skills: Demonstrated ability to lead and inspire a team, with excellent interpersonal and communication skills.
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
focused environment
* Proactive, curious, and capable of balancing technical depth with business understanding
* Excellent communication skills and a collaborative mindset
Tech Stack/Tools: Python, SQL, dbt, Spark or Databricks, GCP (open to AWS/Azure), CI/CD tooling
Benefits:
* Company profit share scheme
* Bupa private healthcare with 24/7 GP access
* Up to 8% employer pension contribution …
in: Data modeling and database design (SQL & NoSQL); Cloud-based architecture on Azure (required), and familiarity with AWS; Microsoft Fabric and Logic Apps; Azure-based machine learning workflows and Databricks on Azure; Designing and optimizing data lakes, warehouses, and pipelines. Experience implementing data governance, security standards, and compliance practices. Strong understanding of metadata management, data lineage, and data quality frameworks.
like Apache Airflow or similar. Strong problem-solving and analytical skills. Excellent communication and collaboration skills. Nice to have: Hands-on experience with data warehouse and lakehouse architectures (e.g., Databricks, Snowflake, or similar). Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP).