Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Scott Logic
… storage, data pipelines to ingest and transform data, and querying & reporting of analytical data. You’ve worked with technologies such as Python, Spark, SQL, PySpark, and Power BI. You’ve got a background in software engineering, including front-end technologies like JavaScript. You’re a problem-solver, pragmatically exploring options …
… and the broader Azure ecosystem. Requirements: Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services. Knowledge of using PySpark in notebooks for data analysis and manipulation. Strong proficiency with SQL and data modelling. Experience with modern ELT/ETL tools within the Microsoft …
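For context on the "PySpark in notebooks" requirement above, the kind of notebook analysis such roles describe might look like the following minimal sketch; the table and column names are hypothetical and not taken from the listing, and in a Fabric or Databricks notebook the Spark session would normally already exist.

```python
# Minimal notebook-style PySpark analysis sketch (hypothetical table/columns).
from pyspark.sql import SparkSession, functions as F

# In a Fabric/Databricks notebook `spark` is usually provided; creating one
# explicitly keeps the sketch self-contained.
spark = SparkSession.builder.appName("notebook-analysis").getOrCreate()

# Load a table (hypothetical name) and do some basic manipulation:
# filter, derive a month column, then aggregate.
orders = spark.read.table("sales.orders")
summary = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_month", F.date_trunc("month", F.col("order_date")))
    .groupBy("order_month", "region")
    .agg(F.count("*").alias("order_count"),
         F.sum("amount").alias("total_amount"))
    .orderBy("order_month")
)
summary.show(truncate=False)
```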
… or a related technical field. Experience with object-oriented programming preferred. General familiarity with some of the technologies we use: Python, Apache Spark/PySpark, Java/Spring, Amazon Web Services, SQL, relational databases. Understanding of data structures and algorithms. Interest in data modeling, visualisation, and ETL pipelines. Knowledge …
… at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage principles. Experience with BI and data visualisation tools, such as Looker or …
… coding languages, e.g. Python, R, Scala (Python preferred). Proficiency in database technologies, e.g. SQL, ETL, NoSQL, DW, and Big Data technologies such as PySpark and Hive. Experienced in working with both structured and unstructured data, e.g. text, PDFs, JPEGs, call recordings, video. Knowledge of machine learning modelling techniques …
… equivalent experience. Professional experience as a Software Engineer. Full-stack engineer – you can tackle any problem. Experience with most or all of: React, Python, PySpark, Databricks, AWS & AI. Self-aware, self-starter who values having impact. Our Values: Act Like an Owner - Think and operate with intention, purpose and …
… to take on new challenges. A strong ownership mentality and drive to solve the most important problems. Nice to have: AWS knowledge, experience with PySpark, Databricks, or other big data technologies. Our Values: Act Like an Owner - Think and operate with intention, purpose and care. Own outcomes. Build Together …
… partners. Technical skills preferred in any or all of the following: Microsoft Excel, Google Suite, Atlassian Jira/Confluence, SQL, Python (Jupyter Notebook, Pandas, PySpark, etc.), Databricks, Bloomberg, Pitchbook, or other common financial services systems and applications. Our Values: Act Like an Owner - Think and operate with intention, purpose …
… Python, R, Scala, etc. (Python preferred). Proficiency in database technologies such as SQL, ETL, NoSQL, Data Warehousing, and Big Data technologies, e.g. PySpark and Hive. About the Company: Accenture is a global professional services company offering expertise in digital, cloud, and security solutions across various industries worldwide.
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras). Proficiency in Python and C/C++. Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow). Proven ability to manage GPU-intensive data processing jobs. 4+ years of applied research or industry experience. Creative problem …
Job Title: Senior Data Modeller. Contract Type: Permanent. Location: Wilmslow or Edinburgh or Glasgow. Working style: Hybrid, 50% home/ …
Data Engineer, Python, PySpark, and SQL, AWS, Edinburgh. Client: Athsai. Location: Edinburgh, United Kingdom. Job Category: Other. EU work permit required: Yes. Posted: 06.06.2025. Expiry Date: 21.07.2025. Job Description: … a crucial role in designing and implementing robust data pipelines, ensuring the integrity and accessibility of data across various platforms. Required Skills: proficient in PySpark and AWS; strong experience in designing, implementing, and debugging ETL pipelines; expertise in Python, PySpark, and SQL; in-depth knowledge of Spark and …
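As a rough illustration of the ETL work described in this listing, a minimal PySpark extract-transform-load sketch follows; the S3 paths, schema, and column names are hypothetical placeholders, not details from the role.

```python
# Minimal PySpark ETL sketch: extract from S3, transform, load back as Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw CSV files from a landing bucket.
raw = spark.read.option("header", True).csv("s3a://example-landing/events/")

# Transform: deduplicate, enforce types, and drop rows missing the key field.
cleaned = (
    raw.dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("event_id").isNotNull())
)

# Load: write partitioned Parquet to a curated bucket for downstream querying.
(
    cleaned.withColumn("event_date", F.to_date("event_ts"))
    .write.mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-curated/events/")
)
```

In practice a job like this would be scheduled by an orchestrator and covered by tests, which is where the "designing, implementing, and debugging" part of the requirement comes in.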
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
JR United Kingdom
… will play a crucial role in designing, developing, and maintaining data architecture and infrastructure. The successful candidate should possess a strong foundation in Python, PySpark, SQL, and ETL processes, with a demonstrated ability to implement solutions in a cloud environment. Experience: 6-9 years. Location: London. Job Type: Hybrid … Permanent. Mandatory Skills: design, build, and maintain data pipelines using Python, PySpark, and SQL; develop and maintain ETL processes to move data from various data sources to our data warehouse on AWS/Azure/GCP; collaborate with data scientists and business analysts to understand their data needs and develop solutions … of our data solutions. Qualifications: minimum 6+ years of total experience, with at least 4+ years of hands-on experience using the mandatory skills: Python, PySpark, and SQL.
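To illustrate how Python, PySpark, and SQL typically come together in the pipelines this listing describes, here is a minimal sketch; the source path, view name, SQL logic, and target table are hypothetical, and the warehouse target would differ across AWS, Azure, and GCP.

```python
# Minimal sketch of a PySpark pipeline step that expresses its transformation in SQL.
# Source path, view name, and target table are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pyspark-sql-step").getOrCreate()

# Read a staged dataset and expose it to SQL as a temporary view.
transactions = spark.read.parquet("/data/staging/transactions/")
transactions.createOrReplaceTempView("transactions")

# SQL is often the easiest form to review with analysts and data scientists.
daily_totals = spark.sql("""
    SELECT
        CAST(transaction_ts AS DATE) AS transaction_date,
        customer_id,
        SUM(amount)                  AS total_amount,
        COUNT(*)                     AS transaction_count
    FROM transactions
    WHERE status = 'SETTLED'
    GROUP BY CAST(transaction_ts AS DATE), customer_id
""")

# Persist the result as a managed table for the warehouse/reporting layer.
daily_totals.write.mode("overwrite").saveAsTable("analytics.daily_customer_totals")
```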
Job Title: MS Fabric Architect. Location: Edinburgh, UK (Hybrid). FTC contract of 6 months. Job Description: The MS Fabric Architect role requires expertise in designing and implementing scalable data solutions with a focus on cloud architectures and Microsoft Fabric technologies.