Wilmslow, England, United Kingdom Hybrid / WFH Options
Mirai Talent
managing data infrastructure, and supporting data products across various cloud environments, primarily Azure. Key Responsibilities: Develop end-to-end data pipelines using Python, Databricks, PySpark, and SQL. Integrate data from various sources including APIs, Excel, CSV, JSON, and databases. Manage data lakes, warehouses, and lakehouses within Azure cloud environments.
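The ingestion responsibility this listing describes (combining CSV, JSON, and API sources into one pipeline) can be sketched in plain Python. The snippet below is a minimal illustration only: the listing's actual stack is Databricks/PySpark on Azure, and the record fields are hypothetical.

```python
import csv
import io
import json

def load_csv(text):
    """Parse CSV text into a list of row dicts (stand-in for a file source)."""
    return list(csv.DictReader(io.StringIO(text)))

def load_json(text):
    """Parse a JSON array of records (stand-in for an API response)."""
    return json.loads(text)

def integrate(*sources):
    """Concatenate records from heterogeneous sources into one staging list."""
    rows = []
    for records in sources:
        rows.extend(records)
    return rows

csv_text = "id,amount\n1,10\n2,20\n"
json_text = '[{"id": 3, "amount": 30}]'
staged = integrate(load_csv(csv_text), load_json(json_text))
print(len(staged))  # 3 records staged
```

Note that the CSV reader yields string values while JSON preserves types; a real pipeline would normalise schemas before loading to the lake.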
and the broader Azure ecosystem. Requirements: Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft …
Microsoft Fabric and Azure ecosystem features and best practices. Requirements: Experience as a Data Engineer with Microsoft Fabric or related Azure services. Knowledge of PySpark in notebooks for data analysis. Proficiency in SQL and data modelling. Experience with ELT/ETL tools within the Microsoft ecosystem. Understanding of data lake …
at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or …
Warrington, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
and modify code, and update monitoring setups. Communicate outages to end users. What we are looking for: Experience reading and writing code in Python, PySpark, and Java. Basic understanding of Spark. Ability to navigate pipeline development tools. What’s in it for you? Base salary of …
a fast-paced environment, ideally supporting supply chain, logistics, or operational teams. Strong analytical and problem-solving skills, with hands-on expertise in Python, PySpark, and SQL for data analysis and transformation. Proficient in building compelling visualisations and dashboards using Power BI, and experience working within the Microsoft Fabric …
Warrington, Cheshire, North West England, United Kingdom
Tenth Revolution Group
best practice for the long-term development of the data science function. Requirements: Experience in a data science leadership position. Expert use of Python, PySpark, and PyTorch. Strong experience with AI and ML best practice and development. Databricks and Azure experience. Strong business and commercial acumen. Excellent stakeholder management …
Chester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem …
… Strong leadership capabilities with experience mentoring, developing, and managing high-performing analytics teams. Advanced analytical and problem-solving skills, with hands-on expertise in Python, PySpark, and SQL for data analysis and transformation. Proficiency in Power BI, with experience building intuitive dashboards and working within the Microsoft Fabric ecosystem. Demonstrated ability to …
Warrington, Cheshire, North West England, United Kingdom
Gerrard White
GBMs, Elastic Net GLMs, GAMs, Decision Trees, Random Forests, Neural Nets, and Clustering. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at …
Data Engineer (Python, PySpark, SQL, AWS), Chester
Client: Athsai
Location: Chester, United Kingdom
Job Category: Other
EU work permit required: Yes
Posted: 06.06.2025; Expiry Date: 21.07.2025
Job Description: … a crucial role in designing and implementing robust data pipelines, ensuring the integrity and accessibility of data across various platforms. Required Skills: Proficient in PySpark and AWS. Strong experience in designing, implementing, and debugging ETL pipelines. Expertise in Python, PySpark, and SQL. In-depth knowledge of Spark and …
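A minimal, runnable sketch of the extract-transform-load pattern this listing describes, using an in-memory SQLite database as a stand-in for the AWS warehouse; the table and column names are hypothetical.

```python
import sqlite3

# In-memory SQLite stands in for the warehouse; the listing's real target is AWS.
conn = sqlite3.connect(":memory:")

# Extract: raw rows landed from an upstream source (illustrative data).
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, 10.0), (2, -5.0), (3, 20.0)])

# Transform + load: filter out invalid rows into a clean, queryable table.
conn.execute("CREATE TABLE orders AS "
             "SELECT id, amount FROM raw_orders WHERE amount > 0")

total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 30.0: the negative-amount row was dropped in the transform step
```

The same filter-then-load shape is what a PySpark job would express with a DataFrame filter followed by a table write; keeping the invalid-row rule in one transform step is what makes pipelines like this debuggable.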
Warrington, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Experience: 6-9 years. Location: London. Job Type: Hybrid … Permanent. Mandatory Skills: Design, build, and maintain data pipelines using Python, PySpark, and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP. Collaborate with data scientists and business analysts to understand their data needs and develop solutions … of our data solutions. Qualifications: Minimum 6+ years of total experience, with at least 4+ years of hands-on experience using the mandatory skills: Python, PySpark, and SQL.
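The pipeline work described above centres on grouped transformations. A pure-Python sketch of a group-and-sum step, the shape of a typical PySpark `groupBy().agg(sum(...))`, follows; the field names are hypothetical.

```python
from collections import defaultdict

def aggregate_by_key(rows, key, value):
    """Group rows by `key` and sum `value`: a minimal grouped-aggregation step."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

# Illustrative records; a real pipeline would read these from a source table.
events = [
    {"region": "north", "sales": 100.0},
    {"region": "south", "sales": 50.0},
    {"region": "north", "sales": 25.0},
]
print(aggregate_by_key(events, "region", "sales"))
```

In PySpark the same logic distributes across the cluster, but the per-key accumulation above is the semantics a data engineer is expected to reason about when debugging such a job.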