Manchester, North West, United Kingdom Hybrid / WFH Options
INFUSED SOLUTIONS LIMITED
culture. Key Responsibilities: Design, build, and maintain scalable data solutions to support business objectives. Work with Microsoft Fabric to develop robust data pipelines. Utilise Apache Spark and the Spark API to handle large-scale data processing. Contribute to data strategy, governance, and architecture best practices. Identify and … approaches. Collaborate with cross-functional teams to deliver projects on time. Key Requirements: Hands-on experience with Microsoft Fabric. Strong expertise in Apache Spark and the Spark API. Knowledge of data architecture, engineering best practices, and governance. DP-600 & DP-700 certifications are highly …
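The listing above names Microsoft Fabric and the Spark API but includes no code. Purely as a hedged illustration of the kind of large-scale batch processing it describes (the table paths and column names below are invented, not taken from the posting), a minimal PySpark sketch might look like this:

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical lakehouse table of orders; paths and columns are illustrative only.
    spark = SparkSession.builder.appName("daily-order-totals").getOrCreate()

    orders = spark.read.format("delta").load("Tables/orders")

    daily_totals = (
        orders
        .withColumn("order_date", F.to_date("order_timestamp"))
        .groupBy("order_date")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("order_count"))
    )

    daily_totals.write.format("delta").mode("overwrite").save("Tables/daily_order_totals")

In a Fabric notebook a SparkSession is normally provided as spark already, so the getOrCreate() call simply reuses it.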
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus …
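This posting asks for real-time pipelines on Kafka and batch optimisation with Spark but gives no implementation detail. A minimal Spark Structured Streaming sketch of that pattern is shown below; the broker address, topic name, schema, and output paths are assumptions for illustration, and the Kafka connector package must be available on the Spark classpath:

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("events-stream").getOrCreate()

    # Hypothetical schema for JSON messages on an 'events' topic.
    schema = StructType([
        StructField("event_id", StringType()),
        StructField("event_type", StringType()),
        StructField("value", DoubleType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
           .option("subscribe", "events")
           .load())

    # Kafka delivers bytes; cast to string and parse the JSON payload.
    parsed = (raw
              .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    query = (parsed.writeStream
             .format("parquet")
             .option("path", "/tmp/streams/events")              # placeholder sink
             .option("checkpointLocation", "/tmp/checkpoints/events")
             .start())

    query.awaitTermination()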
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Client Server
Data Engineer (Python Spark SQL) *Newcastle Onsite* to £70k. Do you have a first class education combined with Data Engineering skills? You could be progressing your career at a start-up Investment Management firm that has secure backing, an established Hedge Fund client as a partner and massive growth … scientific discipline, backed by minimum A A B grades at A-level. You have commercial Data Engineering experience working with technologies such as SQL, Apache Spark and Python, including PySpark and Pandas. You have a good understanding of modern data engineering best practices. Ideally you will also have … will earn a competitive salary (to £70k) plus significant bonus and benefits package. Apply now to find out more about this Data Engineer (Python Spark SQL) opportunity. At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We're …
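As a purely illustrative sketch of the SQL/Spark/Pandas combination this role lists (the data and column names are invented and not from the advert):

    import pandas as pd
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("trades-example").getOrCreate()

    # Hypothetical trades data; in practice this would be read from SQL or files.
    trades = spark.createDataFrame(pd.DataFrame({
        "symbol":   ["ABC", "ABC", "XYZ"],
        "quantity": [100, 50, 200],
        "price":    [10.5, 10.7, 98.2],
    }))

    notional_by_symbol = (trades
                          .withColumn("notional", F.col("quantity") * F.col("price"))
                          .groupBy("symbol")
                          .agg(F.sum("notional").alias("total_notional")))

    # Small aggregated results can be pulled back into Pandas for further analysis.
    print(notional_by_symbol.toPandas())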
lifecycle management, including data pipelines, feature engineering, and model serving. Knowledge of MLOps practices, including versioning, monitoring, and automation. Familiarity with big data technologies (Spark, Hadoop, Databricks) is a plus. Strong problem-solving skills and ability to translate business needs into ML solutions. Excellent communication and leadership skills. Why …
Richmond, North Yorkshire, Yorkshire, United Kingdom
Datix Limited
knowledge of programming languages, specifically Python and SQL. Expertise in data management, data architecture, and data visualization techniques. Experience with data processing frameworks like Apache Spark, Hadoop, or Flink. Strong understanding of database systems (SQL and NoSQL) and data warehousing technologies. Familiarity with cloud computing platforms (AWS, Azure …
processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices and Data Pipelines. Experience working with one or more of Spark, Kafka, or Snowflake. My client has very limited interview slots and they are looking to fill this vacancy within the next 2 weeks. I …
strong grasp of modern cloud architectures. Expertise in data pipelines, ETL/ELT, and working with structured & unstructured data. Strong skills in SQL, Python, Spark, or other relevant technologies. Prior exposure to Microsoft Fabric is a huge plus -- but if you're eager to learn, we'll get you … repositories. 7. Document data architecture, processes, and workflows for reference and knowledge sharing. 8. Utilise programming languages (e.g., C#, Python, SQL) and technologies (e.g., Apache Spark, SSIS, .NET) to manipulate and analyse data. 9. Participate in code reviews, version control (e.g., using Git), and other software development best …
Newcastle Upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services …
availability. Very strong technical background leading application development - with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/Typescript, Selenium, Redux, Junit, Cucumber/Gherkin. …
Utilise Azure Databricks and adhere to code-based deployment practices. Essential Skills: over 3 years of experience with Databricks (including Lakehouse, Delta Lake, PySpark, Spark SQL); strong proficiency in SQL with 5+ years of experience; extensive experience with Azure Data Factory; proficiency in Python programming; excellent stakeholder/client …
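For orientation only, a minimal Databricks-style sketch of the Lakehouse, Delta Lake, PySpark and Spark SQL stack named above (the schema and table names are hypothetical, and on Databricks a Spark session already exists):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # returns the existing session on Databricks

    # Hypothetical bronze-to-silver step: deduplicate raw events into a Delta table.
    raw = spark.read.table("bronze.events")

    silver = (raw
              .dropDuplicates(["event_id"])
              .withColumn("ingested_at", F.current_timestamp()))

    silver.write.format("delta").mode("overwrite").saveAsTable("silver.events")

    # The same table is then queryable with Spark SQL.
    spark.sql("SELECT event_type, COUNT(*) AS n FROM silver.events GROUP BY event_type").show()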
HBase, Elasticsearch). Build, operate, maintain, and support cloud infrastructure and data services. Automate and optimize data engineering pipelines. Utilize big data technologies (Databricks, Spark). Develop custom security applications, APIs, AI/ML models, and advanced analytic technologies. Experience with threat detection in Azure Sentinel, Databricks, MPP Databases …
Knutsford Contract Role Job Description: AWS Services: Glue, Lambda, IAM, Service Catalogue, CloudFormation, Lake Formation, SNS, SQS, EventBridge. Language & Scripting: Python and Spark. ETL: DBT. Good to Have: Airflow, Snowflake, Big Data (Hadoop), and Teradata. Responsibilities: Serve as the primary point of contact for all AWS-related …
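The role lists AWS Glue alongside Python and Spark. Sketched below, with a hypothetical catalog database, table, and S3 path (none of these come from the advert), is the general shape of a Glue ETL job script:

    import sys
    from pyspark.context import SparkContext
    from awsglue.context import GlueContext
    from awsglue.dynamicframe import DynamicFrame
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])

    sc = SparkContext()
    glue_context = GlueContext(sc)
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a table registered in the Glue Data Catalog (names are placeholders).
    source = glue_context.create_dynamic_frame.from_catalog(
        database="analytics_db", table_name="raw_orders"
    )

    # Filter with plain Spark, then convert back to a DynamicFrame for writing.
    completed = source.toDF().filter("status = 'COMPLETE'")
    completed_dyf = DynamicFrame.fromDF(completed, glue_context, "completed_orders")

    glue_context.write_dynamic_frame.from_options(
        frame=completed_dyf,
        connection_type="s3",
        connection_options={"path": "s3://example-bucket/curated/orders/"},
        format="parquet",
    )

    job.commit()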
Cassandra, and Redis. In-depth knowledge of ETL/ELT pipelines, data transformation, and storage optimization. Skilled in working with big data frameworks like Spark, Flink, and Druid. Hands-on experience with both bare metal and AWS environments. Strong programming skills in Python, Java, and other relevant languages. Proficiency …
Trafford Park, Trafford, Greater Manchester, United Kingdom Hybrid / WFH Options
ISR RECRUITMENT LIMITED
cloud solutions. Handling real-time data processing and ETL jobs. Applying AI and data analytics to large datasets. Working with big data tools like Apache Spark and AWS technologies such as Elastic MapReduce, Athena and Lambda. Please contact Edward Laing here at ISR Recruitment to learn more about …
Employment Type: Permanent
Salary: £75000 - £85000/annum (plus excellent company benefits)
of data visualization techniques and tools (e.g., Tableau, Power BI, matplotlib). Experience with cloud-based platforms (AWS, Azure) and big data tools (Hadoop, Spark) is a plus. Educational Background: A degree in Actuarial Science, Mathematics, Statistics, Computer Science, Data Science, or a related field. Progression towards actuarial exams …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Evri
flexibility for remote work. Office visits required for key meetings and collaboration sessions. Key Responsibilities: Develop and maintain scalable data pipelines using Databricks and Apache Spark to process logistics and delivery data. Design ETL workflows that integrate data from multiple delivery and warehouse systems. Development of Data Marts …
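As an illustration only of the Databricks/Spark pipeline and data-mart work described above (the source tables, columns, and mart name are invented, not Evri's):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Hypothetical sources: parcel scan events and depot reference data.
    scans = spark.read.table("raw.parcel_scans")
    depots = spark.read.table("reference.depots")

    # A simple delivery-performance mart: distinct parcels handled per depot per day.
    mart = (scans
            .join(depots, "depot_id")
            .withColumn("scan_date", F.to_date("scan_timestamp"))
            .groupBy("scan_date", "depot_name")
            .agg(F.countDistinct("parcel_id").alias("parcels_handled")))

    mart.write.format("delta").mode("overwrite").saveAsTable("marts.depot_daily_volume")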
datasets, data wrangling, and data preprocessing. Ability to work independently and lead projects from inception to deployment. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure). Preferred Skills: MSc or PhD in Computer Science, Artificial Intelligence, or related field. ADDITIONAL NOTES: Ability …
Altrincham, Cheshire, United Kingdom Hybrid / WFH Options
INRIX, Inc
one or more of the following would also be beneficial: Scala; AWS services like Kinesis, RDS, ElastiCache, S3, Athena, Data Pipeline, Glue, Lambda, EMR, Spark, EC2, ECS, CloudWatch or Elastic Beanstalk; Jenkins or similar tools would also be a plus. A team player - we highly regard collaboration. Knowledge or …
this role, both written & spoken. Preferred (but not required) to have: hands-on experience with Python; experience working with modern data technology (e.g. dbt, Spark, containers, devops tooling, orchestration tools, git, etc.); experience with AI, data science and machine learning technologies. People want to buy from people who understand …
s degree/PhD in Computer Science, Machine Learning, Applied Statistics, Physics, Engineering or related field. Strong mathematical and statistical skills. Experience with Python, Spark and SQL. Experience implementing and validating a range of machine learning and optimization techniques. Effective scientific communication for varied audiences. Autonomy and ownership of …
that meet the evolving needs of the business. Utilise your strong background in data engineering, combined with your existing experience using SQL, Python and Apache Spark in production environments. The role will entail strong problem-solving skills, attention to detail, and the ability to work independently while collaborating …