You’ll build robust data infrastructure to enable smarter audit and risk insights. You’ll design scalable ETL/ELT pipelines in Python (with PySpark) and orchestrate them using tools like Databricks and Snowflake. You’ll work with structured and unstructured data across the firm, integrating APIs, batch loads …
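The listing above centres on ETL/ELT pipelines in Python. As a rough illustration of the extract–transform–load shape such pipelines follow, here is a minimal, self-contained sketch; a real pipeline of the kind described would use PySpark DataFrames on Databricks or Snowflake, but stdlib `csv` and `sqlite3` stand in here so the example runs anywhere. All table and column names are illustrative.

```python
# Minimal ETL sketch: extract rows from CSV text, transform them,
# and load them into a SQLite "warehouse" table. In production this
# would be PySpark reading from APIs or batch loads; sqlite3 and csv
# are stand-ins so the example is self-contained.
import csv
import io
import sqlite3

RAW = "id,amount\n1,10.5\n2,20.0\n3,-3.25\n"

def extract(text):
    """Read raw CSV text into dicts (the 'batch load' step)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast types and drop invalid (negative) amounts."""
    out = []
    for r in rows:
        amount = float(r["amount"])
        if amount >= 0:
            out.append((int(r["id"]), amount))
    return out

def load(rows, conn):
    """Write transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS fact_sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # 30.5 (the -3.25 row was filtered out by transform)
```

The same three-stage structure carries over directly to PySpark, where `extract` becomes `spark.read`, `transform` becomes DataFrame operations, and `load` becomes a `write` to the warehouse.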
and the broader Azure ecosystem. Requirements Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft …
at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or …
and across multiple teams. Ensuring data privacy and security in all data-driven interfaces. Set yourself apart: Good knowledge of SQL/Python/PySpark development. Proven understanding of Azure cloud principles & Azure DevOps. Understanding of SQL optimization best practices. Understanding of data quality principles and best practices. What …
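"SQL optimization best practices" in the listing above typically starts with indexing filter columns. A small, hedged demonstration using SQLite's query planner (chosen only because it ships with Python; the same principle applies to any warehouse engine, and the table/column names are invented for the example):

```python
# Demonstrate a basic SQL optimization: adding an index changes a
# full-table SCAN into an index SEARCH. Uses SQLite's built-in
# EXPLAIN QUERY PLAN; exact wording of the plan varies by engine.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, "2024-01-01") for i in range(1000)],
)

QUERY = "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7"

# Without an index, the planner must scan every row.
before = conn.execute(QUERY).fetchone()[-1]
print(before)  # a SCAN over the events table

# With an index on the filter column, it seeks directly to matches.
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = conn.execute(QUERY).fetchone()[-1]
print(after)  # a SEARCH using idx_events_user
```

Checking `EXPLAIN` output like this before and after an index is a quick way to verify that the optimizer is actually using it.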
Birmingham, England, United Kingdom Hybrid / WFH Options
Linaker Limited
Knowledge of Data Warehouse/Data Lake architectures and technologies. Strong working knowledge of a language for data analysis and scripting, such as Python, PySpark, R, Java, or Scala. Experience with any of the following would be desirable but not essential: Microsoft’s Fabric data platform, experience with ADF …
Skills A degree or equivalent in a science or quantitative subject Strong analytics expertise: We primarily operate within a Microsoft Azure, Databricks, SQL, Python, PySpark & Power BI data environment, so these skills will be very important. Pricing experience is desired Experience of modelling, data management, information systems and related software …
Skills – Able to extract meaningful insights from complex datasets and demonstrate clear business impact. Technical Expertise – Strong experience with SQL, Power BI, Python/PySpark; Databricks is a plus but not essential. Stakeholder Management – Confident in engaging with senior stakeholders, translating data into compelling narratives that drive business decisions. …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem …
Birmingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Experience: 6-9 Years Location: Birmingham Job Type: Hybrid … Permanent Mandatory Skills: Design, build, and maintain data pipelines using Python, PySpark, and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP. Collaborate with data scientists and business analysts to understand their data needs & develop solutions … performance and scalability of our data solutions. Qualifications: Minimum 6+ years of total experience. At least 4+ years of hands-on experience using Python, PySpark, and SQL.
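Pipelines like the one this role describes usually run a data-quality gate between extract and load, in line with the "data quality principles" several of these listings mention. A minimal sketch in plain Python; the column names, required fields, and checks are all illustrative, and a PySpark pipeline would express the same checks as DataFrame expectations.

```python
# Minimal data-quality gate: reject batches with nulls in required
# columns or duplicate primary keys before loading to the warehouse.
# Column names and rules are illustrative, not from any real schema.
def validate_batch(rows, required=("id", "amount"), key="id"):
    errors = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) is None:
                errors.append(f"row {i}: null in required column '{col}'")
    keys = [row.get(key) for row in rows]
    if len(keys) != len(set(keys)):
        errors.append(f"duplicate values in key column '{key}'")
    return errors

good = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
bad = [{"id": 1, "amount": None}, {"id": 1, "amount": 2.0}]
print(validate_batch(good))  # [] — batch passes
print(validate_batch(bad))   # one null error plus one duplicate-key error
```

Returning a list of errors rather than raising on the first failure lets the pipeline log every problem in a batch before quarantining it.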
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
DWP Digital
Data Engineer. Pay up to £38,373, plus 28.97% employer pension contributions, hybrid working, flexible hours, and great work-life balance. As a Data Engineer you will be working within a high-performing team of engineers, developing data-centric solutions …