Kent, South East, United Kingdom Hybrid / WFH Options
ADLIB Recruitment
record of delivering machine learning or AI projects end-to-end Hands-on skills in Python, with frameworks like Scikit-learn, TensorFlow, PyTorch, or PySpark Deep understanding of data science best practices, including MLOps Strong stakeholder communication skills, able to translate complex insights into business impact Experience working in cross …
scalable. What we expect from you Degree in Statistics, Maths, Physics, Economics or similar field Programming skills (Python and SQL are a must-have, PySpark is recommended) Analytical Techniques and Technology Experience with and passion for connecting your work directly to the customer experience, making a real and tangible …
GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets and Clustering Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL) A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science) Experience of WTW’s Radar software is preferred Proficient at …
them. What we expect from you Degree in Statistics, Maths, Physics, Economics or similar field Programming skills (Python and SQL are a must-have, PySpark is recommended) Analytical Techniques and Technology Experience with and passion for connecting your work directly to the customer experience, making a real and tangible …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
ADLIB Recruitment
delivering AI/ML projects end-to-end, with real business impact Hands-on Python skills including libraries like Scikit-learn, TensorFlow, Keras, and PySpark Strong understanding of model deployment and MLOps best practices Ability to work with real-world, unbalanced and messy datasets Skilled in translating technical work …
in Microsoft Azure cloud technologies would be a bonus. What will be your key responsibilities? Technical Proficiency: Collaborate in hands-on development using Python, PySpark, and other relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines … environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJS. Support the implementation of DevOps and CI/CD methodologies to foster agile collaboration and contribute …
ensuring effective collaboration. Design, develop, and optimise scalable data pipelines and infrastructure using AWS (Glue, Athena, Redshift, Kinesis, Step Functions, Lake Formation). Utilise PySpark for distributed data processing, ETL, SQL querying, and real-time data streaming. Architect and implement robust data solutions for analytics, reporting, machine learning, and … ETL architecture. Deep expertise in AWS Data Services, including Glue, Athena, Redshift, Kinesis, Step Functions, and Lake Formation. Strong programming skills in Python and PySpark for data processing and automation. Extensive SQL experience (Spark-SQL, MySQL, Presto SQL) and familiarity with NoSQL databases (DynamoDB, MongoDB, etc.). Proficiency in …
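For illustration, a minimal sketch of the kind of PySpark ETL work the listing above describes: read raw files, clean and aggregate them, and write partitioned Parquet for downstream querying (e.g. via Athena). The S3 paths, columns, and schema here are hypothetical, not taken from the role.

```python
# Illustrative PySpark ETL: extract raw CSV, transform, load partitioned
# Parquet. All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw events landed as CSV (schema inferred for brevity)
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3://example-raw-bucket/trades/"))

# Transform: drop malformed rows and derive a date column for partitioning
clean = (raw
         .dropna(subset=["trade_id", "amount"])
         .withColumn("trade_date", F.to_date("event_time")))

# Aggregate: daily totals per instrument
daily = (clean.groupBy("trade_date", "instrument")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("trade_id").alias("trade_count")))

# Load: partitioned Parquet, queryable from Athena / Redshift Spectrum
(daily.write
 .mode("overwrite")
 .partitionBy("trade_date")
 .parquet("s3://example-curated-bucket/trades_daily/"))
```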
Are you passionate about revolutionising engineering with AI? Here at Monolith AI we're on a mission to empower engineers to use AI to solve even their most intractable physics problems. We've doubled in size over the last four …
Data Engineer Manager Department: Tech Hub Employment Type: Permanent Location: London Reporting To: Lead Data Engineer Description Contract type: Permanent Hours: Full time, 37.5 hours per week Salary: circa £80,000 depending on experience Location: Canary Wharf WFH policy: Employees …
both short and long-term projects across clustering, propensity modelling, regression, and NLP Occasionally building dashboards for clients YOUR SKILLS AND EXPERIENCE Python/PySpark experience is essential for creating propensity models and clustering NLP experience is a plus Commercial awareness and insights experience is needed for this role …
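As a rough illustration of the propensity-modelling skill mentioned above, a minimal scikit-learn sketch: fit a logistic regression and read off predicted probabilities as propensity scores. The features and data are synthetic placeholders, not details from the role.

```python
# Minimal propensity-model sketch; features and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))   # e.g. recency, frequency, spend, tenure
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Propensity scores: predicted probability of the positive class
scores = model.predict_proba(X_test)[:, 1]
print(f"Mean propensity on held-out data: {scores.mean():.3f}")
```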
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
NLP PEOPLE
and supports the development of data pipelines to support model development. Proficient with software tools that develop data pipelines in a distributed computing environment (PySpark, Glue ETL). Supports integration of model pipelines in a production environment. Develops understanding of SDLC for model production. Reviews pipeline designs, makes data model …
We're currently seeking a highly skilled and experienced Principal Data Engineer to join our team as a key technical authority and thought leader for our Data Platform and Engineering team. You'll be responsible for driving the improvement of …
We're seeking a talented and motivated Data Platform Engineer to join our Data Platform and Engineering team. Our aim is to deliver a robust platform that enables Machine Learning and analytics to drive indemnity savings, efficiency benefits and improved …
The Software Engineer will build and run enterprise-grade software systems using a modern tech stack including PySpark with Databricks for data engineering tasks, infrastructure as code with AWS CDK, and GraphQL. As a Software Engineer, you are expected to work with architects to design clean decoupled … experience. Some level of professional working experience; more if no relevant degree. OO and functional programming experience, design patterns, SOLID principles. Experience in Python, PySpark and/or SQL is preferred. Experience with scrum, TDD, BDD, Pairing, Pull Requests, Continuous Integration & Delivery. Continuous Integration tools - GitHub, Azure DevOps, Jenkins …
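For a flavour of the infrastructure-as-code element mentioned above, a hypothetical AWS CDK (v2, Python) sketch defining a single S3 bucket; the stack name, bucket settings, and overall shape are illustrative assumptions, not details from the role.

```python
# Hypothetical AWS CDK v2 sketch: an S3 bucket for a data landing zone.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataLandingStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self, "LandingBucket",
            versioned=True,  # keep history of raw data drops
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

app = cdk.App()
DataLandingStack(app, "data-landing")
app.synth()
```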
Date: 21 Mar 2025 Location: Edinburgh, GB Macclesfield, GB Glasgow, GB Company: Royal London Group Contract Type: Permanent Location: Wilmslow or Edinburgh or Glasgow Working style: Hybrid 50% home/office based The Group Data Office (GDO) is responsible for …
We're seeking a talented and motivated Lead Data Platform Administrator to join our Data Platform and Engineering team. Our aim is to deliver a robust platform that enables Machine Learning and analytics to drive indemnity savings, efficiency benefits and …
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
Location: Guernsey, Haywards Heath, Home Office (Remote) or Manchester Salary: £85,000 - £100,000 - depending on experience Department: Technology and Data We're First Central Insurance & Technology Group (First Central for short), an innovative, market-leading insurance company. We protect …
closely with teams across trading, finance, compliance, and ops. Profile: Strong experience implementing Snowflake in a lead or senior capacity Solid background in Python, PySpark, and Spark Hands-on with platform setup – ideally with a DevOps-first approach Exposure to AWS environments Experience working with data from trading platforms … or within commodities, banking, or financial services Tech environment: Primary Platform: Snowflake Other Tech: DBT, Databricks, Spark, PySpark, Python Cloud: AWS (preferred), Private Cloud storage Data Sources: Financial/trading systems …
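As a hedged sketch of the Python-to-Snowflake work this role implies, a minimal query using the official snowflake-connector-python package; the account, credentials, warehouse, and table names are all placeholders.

```python
# Illustrative Snowflake query from Python; every identifier below
# (account, user, warehouse, database, table) is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",      # e.g. "xy12345.eu-west-1"
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
    database="TRADES_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Example aggregation over a hypothetical trades table
    cur.execute(
        "SELECT instrument, SUM(amount) AS total "
        "FROM trades GROUP BY instrument ORDER BY total DESC LIMIT 10"
    )
    for instrument, total in cur.fetchall():
        print(instrument, total)
finally:
    conn.close()
```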