Manchester, Lancashire, United Kingdom | Hybrid / WFH Options
Smart DCC
What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus …
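For illustration only, a minimal sketch of the kind of batch ETL work this listing describes, assuming PySpark; the paths and column names are hypothetical placeholders, not details from the posting:

```python
# A minimal PySpark batch ETL sketch: extract raw CSV data, apply simple
# transformations, and load the result as Parquet. All paths and column
# names here are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw events from a landing zone.
raw = spark.read.option("header", "true").csv("/data/landing/events.csv")

# Transform: parse timestamps, drop incomplete rows, aggregate per day.
cleaned = (
    raw.withColumn("event_ts", F.to_timestamp("event_ts"))
       .dropna(subset=["user_id", "event_ts"])
)
daily_counts = (
    cleaned.groupBy(F.to_date("event_ts").alias("event_date"))
           .agg(F.count("*").alias("event_count"))
)

# Load: write the aggregate to a warehouse-style Parquet location.
daily_counts.write.mode("overwrite").parquet("/data/warehouse/daily_event_counts")

spark.stop()
```

In a production setting, logic like this would typically run inside an orchestrated pipeline (for example an Airflow DAG) with automated data-quality checks and monitoring, as the listing describes.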
… availability. Very strong technical background leading application development, with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/TypeScript, Selenium, Redux, JUnit, Cucumber/Gherkin.
…'s degree/PhD in Computer Science, Machine Learning, Applied Statistics, Physics, Engineering, or a related field. Strong mathematical and statistical skills. Experience with Python, Spark, and SQL. Experience implementing and validating a range of machine learning and optimization techniques. Effective scientific communication for varied audiences. Autonomy and ownership of …
Preston, Lancashire, North West England, United Kingdom
General Motors
… workflows in a professional motorsports organization. Experience using simulation tools to optimize vehicle performance. Experience with machine learning libraries. Experience with big data tools (e.g. Hadoop, Spark, SQL, and NoSQL databases). About GM: Our vision is a world with Zero Crashes, Zero Emissions, and Zero Congestion, and we embrace …
As a Senior BI Developer, you will be at the forefront of creating analytical solutions and insights into a wide range of business processes throughout the organisation, playing a core role in our strategic initiatives to enhance data-driven …
… to join its innovative team. This role requires hands-on experience with machine learning techniques and proficiency in data manipulation libraries such as Pandas, Spark, and SQL. As a Data Scientist at PwC, you will work on cutting-edge projects, using data to drive strategic insights and business decisions. … (e.g. Sklearn) and deep learning frameworks (such as PyTorch and TensorFlow). Understanding of machine learning techniques. Experience with data manipulation libraries (e.g. Pandas, Spark, SQL). Git for version control. Cloud experience (we use Azure/GCP/AWS). Skills we'd also like to hear about …
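For illustration only, a minimal sketch of the data-manipulation-plus-modelling workflow these requirements point to, assuming pandas and scikit-learn; the file name and column names are hypothetical placeholders:

```python
# A minimal sketch combining data manipulation (pandas) with model fitting
# and validation (scikit-learn). The CSV file and columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Data manipulation: load a tabular dataset and drop incomplete rows.
df = pd.read_csv("customers.csv")
df = df.dropna(subset=["age", "monthly_spend", "churned"])

X = df[["age", "monthly_spend"]]
y = df["churned"]

# Fit a simple classifier and validate it on a hold-out split.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
model = LogisticRegression().fit(X_train, y_train)
print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

For larger datasets, equivalent transformations could be expressed in Spark or SQL, in line with the tools the listing mentions.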