PySpark Jobs in Glasgow

18 of 18 PySpark Jobs in Glasgow

Python Lead Software Engineer

Glasgow, Scotland, United Kingdom
J.P. Morgan
… experience in Data Management, Data Integration, Data Quality, Data Monitoring, and Analytics. Experience leading technologist teams and managing global stakeholders. Proficiency in Python and PySpark for data engineering. Experience building cloud-native applications on platforms such as AWS, Azure, and GCP, leveraging cloud services for data storage, processing, and analytics. …

Data Engineer - Microsoft Fabric

Glasgow, Scotland, United Kingdom
JR United Kingdom
… and the broader Azure ecosystem. Requirements: Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft …
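For context on what "using PySpark in notebooks for data analysis and manipulation" typically looks like, here is a minimal, hypothetical sketch; the file path and column names are illustrative assumptions, not taken from the posting.

```python
# Minimal sketch of notebook-style PySpark data analysis and manipulation.
# The file path and column names below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("notebook-analysis").getOrCreate()

# Load a hypothetical sales extract and inspect its schema.
df = spark.read.option("header", True).csv("/data/sales.csv", inferSchema=True)
df.printSchema()

# Basic manipulation: filter, derive a month column, aggregate, and inspect.
summary = (
    df.filter(F.col("amount") > 0)
      .withColumn("order_month", F.date_format(F.to_date("order_date"), "yyyy-MM"))
      .groupBy("order_month", "region")
      .agg(F.sum("amount").alias("total_amount"),
           F.countDistinct("customer_id").alias("customers"))
      .orderBy("order_month")
)
summary.show(10, truncate=False)
```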

Data Engineer

Glasgow, Scotland, United Kingdom
PACeHR
Role: Data Engineer. Location: Glasgow (3 days a week). Key Requirements/Expertise. Primary skills: Data, Python, PySpark. Should have data engineering expertise leveraging technologies such as Snowflake, Azure Data Factory, ADLS, Databricks, etc. Should have expertise in writing SQL queries against any RDBMS with query …

Python Software Engineer II

Glasgow, Scotland, United Kingdom
JPMorgan Chase
… application development, testing, and operational stability, especially with data pipelines. Proficiency in Python and data manipulation libraries such as NumPy and pandas. Experience with PySpark, including analysis, pipeline building, tuning, and feature engineering. Knowledge of SQL and NoSQL databases, including joins, aggregations, and tuning. Experience with ETL processes and …
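As a rough illustration of the PySpark feature engineering and pandas interplay this posting lists, the sketch below builds a simple per-customer feature with a window function and pulls a small aggregate into pandas; the source path and column names are assumptions, not details from the role.

```python
# A loose sketch of PySpark feature engineering of the kind the posting mentions.
# The parquet path and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("feature-engineering").getOrCreate()

events = spark.read.parquet("/data/events.parquet")  # hypothetical source

# Per-customer window: time gap since the previous event, in seconds.
w = Window.partitionBy("customer_id").orderBy("event_ts")
features = (
    events
    .withColumn("prev_ts", F.lag("event_ts").over(w))
    .withColumn("secs_since_prev",
                F.col("event_ts").cast("long") - F.col("prev_ts").cast("long"))
)

# Small aggregates can be pulled into pandas for local analysis.
pdf = features.groupBy("customer_id").agg(
    F.avg("secs_since_prev").alias("avg_gap_secs")
).toPandas()
print(pdf.describe())
```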

Senior Data Architect

Glasgow, Scotland, United Kingdom
JR United Kingdom
… at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or …

Software Engineer II - AWS

Glasgow, Scotland, United Kingdom
J.P. Morgan
… Databases such as MongoDB. Experience in various messaging technologies such as Kafka. Cloud certifications, including AWS Developer Associate and AWS Solutions Architect Associate. Experience with PySpark. Good understanding of event-based architecture. AI/ML field knowledge and trends. Experience with Java and Big Data technologies will be a strong plus. …

Software Engineer II - Java Full-stack Developer

Glasgow, Scotland, United Kingdom
TN United Kingdom
… technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile). Preferred qualifications, capabilities, and skills: knowledge of AWS; knowledge of Databricks and PySpark; understanding of Cloudera Hadoop, Spark, HDFS, HBase, and Hive.

Lead Software Engineer - CIB

Glasgow, Scotland, United Kingdom
ZipRecruiter
… testing, and operational stability. Proficient in coding in Python. Proficient in the use of basic data science libraries in Python (NumPy, pandas, scikit-learn, PySpark). Experience in developing, debugging, and maintaining code in a large corporate environment with modern programming and database querying languages. Overall knowledge of the Software …

Lead Software Engineer - CIB | Glasgow, UK

Glasgow, Scotland, United Kingdom
JPMorgan Chase & Co
… testing, and operational stability. Proficient in coding in Python. Proficient in the use of basic data science libraries in Python (NumPy, pandas, scikit-learn, PySpark). Experience in developing, debugging, and maintaining code in a large corporate environment with modern programming languages and database querying languages. Overall knowledge of …

Director of Software Engineering, Payments EMEA Regulatory Data | Glasgow, UK

Glasgow, Scotland, United Kingdom
JPMorgan Chase & Co
… as a Product Owner or Product Manager. Practical cloud-native development experience. Expertise in Computer Science, Engineering, Mathematics, or related fields. Deep knowledge of PySpark, AWS Cloud, Databricks, and Java. Experience in implementing scaled platforms. Excellent communication skills for diverse stakeholders. Preferred Qualifications, Capabilities, and Skills: Hands-on coding experience. …

Lead Software Engineer

Glasgow, Scotland, United Kingdom
JPMorganChase
… etc.). In-depth knowledge of the financial services industry and its IT systems. Practical cloud-native experience. Preferred Qualifications, Capabilities, and Skills: Knowledge of PySpark and Databricks is desirable. Knowledge of MQ and Kafka. About Us: J.P. Morgan is a global leader in financial services, providing strategic advice and products to …

Director of Software Engineering, Payments EMEA Regulatory Data

Glasgow, Scotland, United Kingdom
ZipRecruiter
Experience as a Product Owner or Product Manager. Practical cloud experience. Expertise in Computer Science, Engineering, Mathematics, or related fields. Knowledge of technologies like PySpark, AWS Cloud, Databricks, and Java. Experience implementing scaled platforms. Excellent communication skills for diverse stakeholders. Additional qualifications: Experience working at the code level. Understanding of …

Python Software Engineer II

Glasgow, Scotland, United Kingdom
TN United Kingdom
… with one or more modern programming languages and database querying languages. Demonstrable ability to code in one or more languages such as Python or PySpark. Experience across the whole Software Development Life Cycle. Exposure to agile methodologies such as CI/CD, Application Resiliency, and Security. Emerging knowledge of …

Data Engineer

Glasgow, Scotland, United Kingdom
PURVIEW
… that do not necessarily have all of the qualifications, but have sufficient experience and talent. Responsibilities for technical data analyst. Primary skills: Data, Python, PySpark. Experience in Finance or Banking. Experience working with enterprise Master Data Management tools to validate, match, and merge various master data sources into a …

Founding Machine Learning Engineer

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem …

Data Analyst

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
PURVIEW
… that do not necessarily have all of the qualifications, but have sufficient experience and talent. Responsibilities for technical data analyst. Primary skills: Data, Python, PySpark. Experience working with enterprise Master Data Management tools to validate, match, and merge various master data sources into a common view. Experience with data …

Data Analyst

Glasgow, Scotland, United Kingdom
Hellowork Consultants
Associate @ Hellowork Consultants | Connecting Top Talent Across UK & European Union | Helping Companies Scale with Exceptional... Responsibilities for technical data analyst. Primary skills: Data, Python, PySpark. Experience working with enterprise Master Data Management tools to validate, match, and merge various master data sources into a common view. Experience with data …

Senior Data Engineer London Hybrid (6+ Years)

Glasgow, Scotland, United Kingdom
Hybrid / WFH Options
JR United Kingdom
… will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Experience: 6-9 years. Location: London. Job Type: Hybrid … Permanent. Mandatory Skills: Design, build, and maintain data pipelines using Python, PySpark, and SQL. Develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP. Collaborate with data scientists and business analysts to understand their data needs and develop solutions … of our data solutions. Qualifications: Minimum 6+ years of total experience. At least 4+ years of hands-on experience using the mandatory skills: Python, PySpark, SQL.
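As a loose sketch of the "design, build, and maintain data pipelines using Python, PySpark, and SQL" requirement above, the example below shows a minimal extract-transform-load step; the bucket path, table name, and columns are hypothetical assumptions rather than details from the posting.

```python
# Minimal sketch of a Python/PySpark ETL step of the kind this listing describes:
# extract from a raw landing zone, transform, and load into a warehouse table.
# Paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-etl").getOrCreate()

# Extract: raw orders landed as JSON by an upstream system (assumed layout).
raw = spark.read.json("s3://example-bucket/landing/orders/2024-01-01/")

# Transform: basic cleansing and conforming to the warehouse schema.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("status").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .select("order_id", "customer_id", "order_date", "status", "amount")
)

# Load: append into a partitioned warehouse table registered in the metastore.
(clean.write
      .mode("append")
      .partitionBy("order_date")
      .saveAsTable("analytics.orders"))
```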