storage, data pipelines to ingest and transform data, and querying and reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, and Power BI. You're a problem-solver, pragmatically exploring options and finding effective solutions. An understanding of how to design and build well-structured …
working in cloud-native environments (AWS preferred). Strong proficiency with Python and SQL. Extensive hands-on experience in AWS data engineering technologies, including Glue, PySpark, Athena, Iceberg, Databricks, Lake Formation, and other standard data engineering tools. Familiarity with DevOps practices and infrastructure-as-code (e.g., Terraform, CloudFormation). Solid understanding …
of experience in Data Engineering, with a focus on cloud platforms (Azure, AWS, GCP). You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF …
Candidate Profile: Strong technical background in SQL, scripting languages (Python, TypeScript, JavaScript), databases, ML/LLM models, and big data technologies like Apache Spark (PySpark, Spark SQL). Self-starter with the ability to work from requirements to solutions. Effective communicator, passionate learner, and accountable owner of deliverables. Customer …
to Octopus offices across Europe and the US. Our Data Stack: SQL-based pipelines built with dbt on Databricks; analysis via Python Jupyter notebooks; PySpark in Databricks workflows for heavy lifting; Streamlit and Python for dashboarding; Airflow DAGs with Python for ETL, running on Kubernetes and Docker; Django for …
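As a concrete illustration of the stack this listing describes, here is a minimal sketch of a daily Airflow DAG with an extract step feeding a transform step. It assumes Airflow 2.x with the TaskFlow API; the DAG name, task names, and storage path are hypothetical, not taken from the listing.

```python
# Minimal sketch of a daily ETL DAG (assumes Airflow 2.x TaskFlow API).
# All names and paths below are hypothetical illustrations.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_etl():
    @task
    def extract_orders() -> str:
        # In a real pipeline this might land raw files in object storage
        # and hand the path to the next task via XCom.
        return "s3://raw-bucket/orders/latest/"

    @task
    def transform(path: str) -> None:
        # Placeholder for the "heavy lifting" the listing says runs as
        # PySpark inside Databricks workflows.
        print(f"Transforming data at {path}")

    transform(extract_orders())


daily_etl()
```

In a setup like the one described, each task of such a DAG would typically run in its own container on Kubernetes.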
experience. Preferred Qualifications: Master's degree in a related field and/or related certification(s); 1+ years of experience with programming languages (e.g., Python, PySpark); 1+ years of experience with data orchestration/ETL tools (e.g., Airflow, NiFi); experience with Snowflake, Databricks/EMR/Spark, and/or Airflow. …
machine learning, with a strong portfolio of relevant projects. Proficiency in Python with libraries like TensorFlow, PyTorch, or scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL …
2+ years in a senior role. Deep expertise with AWS services (S3, Glue, Lambda, SageMaker, Redshift, etc.). Strong Python and SQL skills; experience with PySpark is a bonus. Familiarity with containerization (Docker), orchestration (Airflow, Step Functions), and infrastructure as code (Terraform/CDK). Solid understanding of machine learning model lifecycle …
Strong experience designing and delivering data solutions in Databricks. Proficient with SQL and Python. Experience using big data technologies such as Apache Spark or PySpark. Great communication skills, engaging effectively with senior stakeholders. Nice to haves: Azure/AWS data engineering certifications; Databricks certifications. What's in it for …
System Integration, Application Development or Data Warehouse projects, across technologies used in the enterprise space. Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks. Stakeholder management. Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC process, including testing and deployment. …
cloud environments to support ambitious data initiatives and future projects. IT Manager - The Skills You'll Need to Succeed: Mastery of Databricks, Python/PySpark, and SQL/Spark SQL. Experience in Big Data/ETL (Spark and Databricks preferred). Expertise in Azure. Proficiency with version control systems (Git …
Engineering. Develop customer relationships and build internal partnerships with account executives and teams. Prior experience coding in a core programming language (e.g., Python, PySpark, or SQL) and willingness to learn a base level of Spark. Proficient with Big Data analytics technologies, including hands-on expertise with complex proofs …
propositions. Develop customer relationships and build internal partnerships with account executives and teams. Prior experience coding in a core programming language (e.g., Python, PySpark, or SQL) and willingness to learn a base level of Spark. Hands-on expertise with complex proofs-of-concept and public cloud platforms (AWS …
/Experience: The ideal candidate will have the following: Proven experience leading data engineering projects, especially cloud migration initiatives. Hands-on experience with Python, PySpark, SQL, and Scala. Deep knowledge of modern data architecture, data modeling, and ELT/ETL practices. Strong grasp of data security, governance, and compliance …
and contributing to their continuous improvement. What do I need? Proficiency in Azure and its data-related services. Strong SQL and PySpark skills, with a focus on writing efficient, readable, modular code. Experience of development on modern cloud data platforms (e.g. Databricks, Snowflake, Redshift). …
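As one reading of the "efficient, readable, modular" PySpark code this listing asks for, here is a minimal sketch: small, individually testable transform functions chained with DataFrame.transform. The column names (status, amount) and the VAT calculation are hypothetical examples, not from the listing.

```python
# Minimal sketch of modular PySpark transforms (hypothetical columns/logic).
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def only_completed(df: DataFrame) -> DataFrame:
    """Keep completed orders only."""
    return df.filter(F.col("status") == "completed")


def add_vat(df: DataFrame, rate: float = 0.2) -> DataFrame:
    """Add a VAT-inclusive amount column."""
    return df.withColumn("amount_inc_vat", F.col("amount") * (1 + rate))


if __name__ == "__main__":
    spark = SparkSession.builder.appName("modular-example").getOrCreate()
    orders = spark.createDataFrame(
        [("completed", 100.0), ("pending", 50.0)], ["status", "amount"]
    )
    # Chaining named transforms keeps each step small and unit-testable.
    orders.transform(only_completed).transform(add_vat).show()
```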
similar tools is valuable (Collibra, Informatica Data Quality/MDM/Axon, etc.). Data architecture experience is a bonus. Python, Scala, Databricks, Spark, and PySpark, with data engineering skills. Ownership and the ability to drive implementation/solution design. …
working with Databricks in a production environment. Strong background in ETL/ELT pipeline design and implementation within Databricks. Proficiency in SQL, Python, and PySpark for data processing and analysis. Experience with streaming technologies such as Kafka for real-time data processing. Experience in data migration and integration for …
and mitigate risks, issues, or control weaknesses in your daily work. What we're looking for: Strong experience with dbt. Proficiency in SQL, Python, PySpark, or other relevant data processing languages; familiarity with cloud platforms like AWS, GCP, or Azure is desirable. Excellent problem-solving skills and attention to …