Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Smart DCC
What will you be doing? Design and implement efficient ETL processes for data extraction, transformation, and loading. Build real-time data processing pipelines using platforms like Apache Kafka or cloud-native tools. Optimize batch processing workflows with tools like Apache Spark and Flink for scalable performance. Infrastructure Automation: Implement … Integrate cloud-based data services with data lakes and warehouses. Build and automate CI/CD pipelines with Jenkins, GitLab CI/CD, or Apache Airflow. Develop automated test suites for data pipelines, ensuring data quality and transformation integrity. Monitoring & Performance Optimization: Monitor data pipelines with tools like Prometheus More ❯
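For readers less familiar with the stack named in this listing, a minimal sketch of one batch ETL step in PySpark follows; the bucket paths, column names, and deduplication rule are illustrative assumptions, not the employer's actual pipeline.

```python
# Minimal batch ETL sketch with PySpark (paths and columns are assumptions).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_readings_etl").getOrCreate()

# Extract: read raw CSV landed by an upstream process (hypothetical path).
raw = spark.read.option("header", True).csv("s3a://raw-bucket/readings/2024-01-01/")

# Transform: cast types, drop malformed rows, deduplicate on a business key.
clean = (
    raw.withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
       .withColumn("read_at", F.to_timestamp("read_at"))
       .withColumn("read_date", F.to_date("read_at"))
       .dropna(subset=["meter_id", "read_at", "reading_kwh"])
       .dropDuplicates(["meter_id", "read_at"])
)

# Load: write partitioned Parquet into the curated/staging area of the lake.
clean.write.mode("overwrite").partitionBy("read_date").parquet(
    "s3a://curated-bucket/readings/"
)
```

Partitioning on a date column is one common convention for keeping the curated layer query-efficient; a real pipeline would follow the layout the downstream warehouse expects.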
Thornton-Cleveleys, Lancashire, North West, United Kingdom
Victrex Manufacturing Ltd
architecture, data modelling, ETL/ELT processes and data pipeline development. Competency with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark, Kafka, etc.). Excellent communication and leadership skills, with the ability to engage and influence stakeholders at all levels. Insightful problem-solving skills and More ❯
s degree/PhD in Computer Science, Machine Learning, Applied Statistics, Physics, Engineering or related field Strong mathematical and statistical skills Experience with Python, Spark and SQL Experience implementing and validating a range of machine learning and optimization techniques Effective scientific communication for varied audiences Autonomy and ownership of More ❯
or Development, Content Distribution/CDN, Scripting/Automation, Database Architecture, Cloud Architecture, Cloud Migrations, IP Networking, IT Security, Big Data/Hadoop/Spark, Operations Management, Service Oriented Architecture etc. Experience in a 24x7 operational services or support environment. Experience with AWS Cloud services and/or other More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Qodea
GCP). Proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in at least one More ❯
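As a rough illustration of the warehousing and orchestration tools mentioned in this listing, here is a minimal Airflow DAG that stages a CSV from Google Cloud Storage into BigQuery; the bucket, dataset, table, and daily schedule are assumptions made for the sketch.

```python
# Minimal Airflow DAG sketch: stage a CSV from GCS into BigQuery each day.
# Bucket, dataset, and table names are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-raw-bucket",
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="analytics.staging_orders",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )
```

Assuming the Google provider package is installed, `airflow dags test gcs_to_bigquery_daily 2024-01-01` exercises the DAG locally for a single logical date.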
Our systems are self-healing, responding gracefully to extreme loads or unexpected input. We use modern languages like Kotlin and Scala; data technologies such as Kafka, Spark, MLflow, Kubeflow, VastStorage and StarRocks; and agile development practices. Most importantly, we hire great people from around the world and empower them to be successful. More ❯
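The "responding gracefully to unexpected input" point above can be illustrated with a small Kafka consumer sketch (in Python rather than the Kotlin/Scala the team uses); the topic name, broker address, and skip-and-log policy are assumptions.

```python
# Sketch of a Kafka consumer that tolerates malformed messages instead of
# crashing; topic, servers, and the skip-and-log policy are assumptions.
import json
import logging

from kafka import KafkaConsumer  # kafka-python client

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("events-consumer")

consumer = KafkaConsumer(
    "events",                            # hypothetical topic name
    bootstrap_servers="localhost:9092",
    group_id="example-group",
    value_deserializer=lambda raw: raw,  # keep raw bytes; parse defensively below
)

for message in consumer:
    try:
        event = json.loads(message.value)
        log.info("processed event %s", event.get("id"))
    except (json.JSONDecodeError, UnicodeDecodeError):
        # Unexpected input: log and move on rather than failing the pipeline.
        log.warning("skipping malformed message at offset %s", message.offset)
```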
object-oriented design principles, and data structures Extensive experience in developing microservices using Java, Python Experience in distributed computing frameworks like Hive/Hadoop, Apache Spark. Good experience in test-driven development and automating test cases using Java/Python Experience in SQL/NoSQL (Oracle, Cassandra) database design … following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda Working experience with Terraform Experience in creating workflows for Apache Airflow About Roku Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and More ❯
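As a sketch of the test-driven style this listing asks for, a small pytest example follows; the normalise_record helper and its contract are hypothetical, invented only to show a test-first structure.

```python
# Test-first sketch in pytest: the transformation under test (normalise_record)
# and its expected behaviour are hypothetical, written to show the TDD style.
import pytest


def normalise_record(record: dict) -> dict:
    """Lower-case keys, strip string values, and require a non-empty id."""
    if not record.get("id") and not record.get("ID"):
        raise ValueError("record is missing an id")
    return {
        key.lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }


def test_normalise_record_cleans_keys_and_values():
    raw = {"ID": " 42 ", "Name": "  Roku  "}
    assert normalise_record(raw) == {"id": "42", "name": "Roku"}


def test_normalise_record_rejects_missing_id():
    with pytest.raises(ValueError):
        normalise_record({"name": "no id here"})
```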
As a Senior BI Developer, you will be at the forefront of creating Analytical Solutions and insights into a wide range of business processes throughout the organisation and playing a core role in our strategic initiatives to enhance data-driven More ❯
to join its innovative team. This role requires hands-on experience with machine learning techniques and proficiency in data manipulation libraries such as Pandas, Spark, and SQL. As a Data Scientist at PwC, you will work on cutting-edge projects, using data to drive strategic insights and business decisions. … e.g. Sklearn) and deep learning frameworks (such as PyTorch and TensorFlow). Understanding of machine learning techniques. Experience with data manipulation libraries (e.g. Pandas, Spark, SQL). Git for version control. Cloud experience (we use Azure/GCP/AWS). Skills we'd also like to hear about More ❯
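A minimal sketch of the workflow this listing implies, combining pandas data manipulation with a scikit-learn model and cross-validated evaluation; the synthetic dataset is a stand-in, not project data.

```python
# Minimal sketch: pandas for manipulation, scikit-learn for model fitting and
# validation. The generated dataset is a synthetic stand-in, not real data.
import pandas as pd
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic tabular data wrapped in a DataFrame for pandas-style handling.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
df = pd.DataFrame(X, columns=[f"feature_{i}" for i in range(X.shape[1])])
df["target"] = y

# Simple feature selection step, then cross-validated model evaluation.
features = df.drop(columns=["target"])
scores = cross_val_score(
    LogisticRegression(max_iter=1000), features, df["target"], cv=5
)
print(f"mean accuracy: {scores.mean():.3f}")
```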
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Yelp USA
Summary Yelp engineering culture is driven by our values: we're a cooperative team that values individual authenticity and encourages creative solutions to problems. All new engineers deploy working code their first week, and we strive to broaden individual impact More ❯