and ensuring best practices are understood and followed. Technical Skills and Qualifications: Expert knowledge of Python, including libraries/frameworks such as pandas, NumPy, and PySpark. Good understanding of OOP, software design patterns, and SOLID principles. Good experience with Docker, Linux, and Airflow. Good knowledge …
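Roles asking for pandas and NumPy skills at this level are often screened with a short aggregation exercise. A minimal sketch of the kind of task involved (the data and column names here are hypothetical, purely for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical sales data standing in for a typical screening exercise.
df = pd.DataFrame({
    "region": ["North", "South", "North", "South"],
    "sales": [100.0, 200.0, 150.0, 50.0],
})

# Aggregate per region, then derive a column with a NumPy ufunc.
summary = df.groupby("region", as_index=False)["sales"].sum()
summary["log_sales"] = np.log(summary["sales"])
print(summary)
```

The same group-and-aggregate pattern carries over almost verbatim to PySpark (`df.groupBy("region").sum("sales")`), which is why recruiters tend to list the two libraries together.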
snowflake schemas. Knowledge of DevOps practices within a Power BI environment. Familiarity with Microsoft Fabric & Databricks. SQL databases expertise, data engineering with Python and PySpark, and knowledge of geospatial concepts and tools. As part of this engagement, you will work on initiatives that redefine business efficiency through AI. You …
building a massively distributed, cloud-hosted data platform that would be used by the entire firm. Tech stack: Python/Java, Dremio, Snowflake, Iceberg, PySpark, Glue, EMR, dbt, and Airflow/Dagster. Cloud: AWS (Lambda, ECS services). This role would focus on various areas of data engineering, including: End …
I have placed quite a few candidates with this organisation now, and all have given glowing reviews. They are the leader in legal representation comparison and have rather interesting, unique data to work with. They are using modern Azure …
data experience. Data inventory and data familiarisation. Efficient data ingestion and ingestion pipelines. Data cleaning and transformation. Databricks (ideally with Unity Catalog). Python and PySpark. CI/CD (ideally with Azure DevOps). Unit testing (PyTest). If you have the above experience and are looking for a new contract role …
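The "data cleaning and transformation" plus "unit testing (PyTest)" combination above usually means writing transformations as plain functions and testing them directly. A minimal sketch of that pattern, with a hypothetical cleaning step invented for illustration:

```python
import pandas as pd

def clean_emails(df: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical cleaning step: trim whitespace, lowercase, drop blanks."""
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()
    return out[out["email"] != ""].reset_index(drop=True)

# A PyTest-style unit test: pure-function transforms like this can be
# tested on tiny in-memory frames without a Spark cluster or Databricks.
def test_clean_emails():
    raw = pd.DataFrame({"email": ["  Alice@Example.COM ", ""]})
    result = clean_emails(raw)
    assert result["email"].tolist() == ["alice@example.com"]
```

Keeping the transformation logic separate from the ingestion pipeline is what makes it unit-testable; the same function can then be applied inside a PySpark job or a Databricks notebook.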
West Midlands, Dudley, West Midlands (County), United Kingdom Hybrid / WFH Options
Concept Resourcing
standardise the Data Warehouse environment and solutions. Required Skills, Knowledge, and Experience: Experience designing, developing, and testing Azure Data Factory/Fabric pipelines, including PySpark Notebooks and Dataflow Gen2 workflows. Previous experience with Informatica PowerCenter is desired. Significant experience in designing, writing, editing, debugging, and testing SQL code …
applied machine learning, probability, statistics, and quantitative risk modelling. High proficiency in Python and SQL. Experience with big data technologies and tools, particularly Databricks and PySpark, is highly desirable. Experience in agile software development processes is a plus. Experience in insurance, cyber, or a related domain is ideal. Understanding of …
Senior Data Engineer. Remote working. Salary £65,000 - £70,000 plus benefits. Databricks, PySpark, SQL. We are looking for a talented Senior Data Engineer to join one of the UK's leading research and law ranking companies at an exciting time of growth. Build new products, engineer new solutions …
Employment Type: Permanent
Salary: £60000 - £70000/annum plus remote working and benefits
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Instances, Azure DB, Data Lakes, Azure Synapse and Power BI, Azure Data Factory). Proficiency in at least one scripting language (e.g. SQL or PySpark/Python). Proficiency in designing and building APIs and in API consumption. Familiarity with data visualisation tools such as Power BI. Proficiency in ETL/… Model, Data Platforms, Data Models, Data Architecture, Data Integration, DIAS, Microsoft Azure, Azure DB, Data Lakes, Azure Synapse, Power BI, Azure Data Factory, SQL, PySpark, Python, APIs, ETL, ELT, CI/CD, Data Pipelines, Data-as-a-Service. Please note that due to a high level of applications, we …