data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively …
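For context on what such a responsibility looks like in practice, here is a minimal PySpark sketch of a validation-plus-aggregation step; every table, column, and path name below is hypothetical rather than taken from the posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quality-checks").getOrCreate()

# Hypothetical input: an orders table with order_id, customer_id, amount, order_ts.
orders = spark.read.parquet("s3://example-bucket/orders/")

# Validation rules: reject rows with missing keys or non-positive amounts.
valid = orders.filter(
    F.col("order_id").isNotNull()
    & F.col("customer_id").isNotNull()
    & (F.col("amount") > 0)
)
rejected = orders.subtract(valid)  # quarantined rather than silently dropped

# Enrichment and aggregation: daily spend per customer.
daily_spend = (
    valid.withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("total_spend"), F.count("*").alias("order_count"))
)

daily_spend.write.mode("overwrite").parquet("s3://example-bucket/daily_spend/")
rejected.write.mode("overwrite").parquet("s3://example-bucket/orders_rejected/")
```

Splitting input into valid and rejected sets is one common pattern: bad records are kept for inspection rather than silently dropped, so quality issues stay visible.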
Snowflake), writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux containers (e.g., Docker). · Related/complementary open-source software platforms and languages (e.g., Scala, Python, Java, Linux). · Experience with both relational (RDBMS) and non-relational databases. · Analytical and problem-solving skills applied to big data datasets. · Experience working on projects with agile/…
Identify opportunities for automation and recommend tools to improve data engineering workflows. Documentation: Maintain detailed technical documentation for all solutions and processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g., Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems …
Strong understanding of DevOps methodologies and principles. Solid understanding of data warehousing, data modeling, and data integration principles. Proficiency in at least one scripting/programming language (e.g., Python, Scala, Java). Experience with SQL and NoSQL databases. Familiarity with data quality and data governance best practices. Strong analytical and problem-solving skills. Excellent communication, interpersonal, and presentation skills. Desired …
services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
context of continuous improvement. Continuously enhance the quality of our tools and applications through bug fixes and code refactoring. Leverage the latest data technologies and programming languages, including Python, Scala, and Java, along with systems like Spark, Kafka, and Airflow, within cloud services such as AWS. Ensure the ongoing maintenance, troubleshooting, optimization, and reliability of data systems, including timely resolution … leading services for customers. What You'll Need: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. Proficiency in programming languages such as Python, Scala or Java. SQL knowledge for database querying and management. Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles. Proven ability to design, build, and maintain …
Azure). Deep understanding of data modeling, distributed systems, streaming architectures, and ETL/ELT pipelines. Proficiency in SQL and at least one programming language such as Python, Scala, or Java. Demonstrated experience owning and delivering complex systems from architecture through implementation. Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
in the Databricks Data Intelligence platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on premises or in the cloud. Experience delivering multiple solutions using key techniques such as Governance, Architecture, Data Modelling, ETL …
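For readers unfamiliar with the Databricks components named above, a minimal sketch of reading and writing Delta tables with PySpark follows; the three-part Unity Catalog table names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession called `spark` already exists; the builder
# line is only needed when testing outside the platform.
spark = SparkSession.builder.appName("delta-example").getOrCreate()

# Read a Unity Catalog table (catalog.schema.table name is hypothetical).
events = spark.read.table("main.analytics.raw_events")

# A simple cleaning pass, written back as a managed Delta table.
cleaned = (
    events.dropDuplicates(["event_id"])
    .filter(F.col("event_ts").isNotNull())
)
cleaned.write.format("delta").mode("overwrite").saveAsTable("main.analytics.clean_events")
```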
to having resided in the UK for at least the past 5 years and being a UK national or dual UK national. Experience in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Experience with the design, build and maintenance of data pipelines and infrastructure …
The Company: When it comes to innovation and achievement, there are few organisations with a better track record. Join us and you’ll play a big part in the success of our fast-paced business that …
technical skills: · Expert-level proficiency in SQL (e.g., MySQL, PostgreSQL, Redshift) and NoSQL databases (e.g., MongoDB, Cassandra). · Proficiency in one or more programming languages (e.g., Python, Scala, Java) for data integration, automation, and problem-solving. · In-depth experience with cloud data platforms (AWS, Azure, Google Cloud) and Microsoft services (e.g., Azure Storage, Azure Databricks, Azure Synapse …
AWS, GCP, or Azure) Deep understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non-technical …
Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar Familiarity with data warehousing, ETL/ELT processes, and analytics engineering Programming proficiency in Python, Scala or Java Experience operating in a cloud-native environment (e.g. AWS, GCP, or Azure) Excellent stakeholder management and communication skills A strategic mindset, with a practical approach to delivery and …
platforms supporting both batch and real-time processing architectures. Deep understanding of data warehousing, ETL/ELT pipelines, and analytics engineering principles. Proficient in programming languages such as Python, Scala, or Java, and experienced with cloud platforms (AWS, GCP, or Azure). Experience working with privacy-sensitive data and implementing comprehensive observability and governance solutions. Strong technical foundation with a …
GitHub proficiency. Strong organizational, analytical, problem-solving, and communication skills. Comfort working with remote teams and distributed delivery models. Additional skills that are a plus: programming languages such as Scala, Rust, Go, Angular, React, Kotlin; database management with PostgreSQL; experience with ElasticSearch and observability tools like Grafana and Prometheus. What this role can offer: Opportunity to deepen understanding of AI and …
services to distribute data to different consumers across PWM. Technologies used include: Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy - a data management and data governance platform Programming Languages: Java, Scala, Scripting Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE Micro Service Technologies: REST, Spring Boot, Jersey Build and CI/CD Technologies: Gradle, Jenkins, GitLab, SVN Cloud Technologies: … in code reviews and collaborate on experimenting with new tech stacks SKILLS AND EXPERIENCE WE ARE LOOKING FOR Computer Science, Mathematics, Engineering or other related degree at bachelor's level Java, Scala, Scripting, REST, Spring Boot, Jersey Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE 3+ years of hands-on experience with relevant technologies ABOUT GOLDMAN SACHS At Goldman Sachs …
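Since this listing pairs Kafka with Spark, a minimal Spark Structured Streaming sketch that consumes a Kafka topic may be useful; the broker address and topic name are placeholders, and this is not presented as the firm's actual setup:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

# Requires the spark-sql-kafka connector package on the classpath.
updates = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "position-updates")           # placeholder topic
    .load()
    .select(
        F.col("key").cast("string"),
        F.col("value").cast("string"),
    )
)

# Console sink for demonstration; a real job would write to a durable store.
query = (
    updates.writeStream.format("console")
    .option("checkpointLocation", "/tmp/checkpoints/position-updates")
    .start()
)
query.awaitTermination()
```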
GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow). Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development. Scala is mandated in some cases. Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies. Strong proficiency in SQL …
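For the Dataflow item above, a minimal Apache Beam pipeline in Python might look like the following; the bucket paths are illustrative, and the pipeline runs locally on the DirectRunner unless Dataflow options are supplied:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Default options run on the local DirectRunner; running on Dataflow would
# add runner, project, region, and temp_location options here.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "Parse" >> beam.Map(lambda line: line.split(","))
        | "KeepValid" >> beam.Filter(lambda fields: len(fields) == 3)
        | "Format" >> beam.Map(lambda fields: ",".join(fields))
        | "Write" >> beam.io.WriteToText("gs://example-bucket/output/part")
    )
```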
What You Bring: 2+ years in data engineering or related roles Bachelor’s in CS, Engineering, Mathematics, Finance, etc. Proficiency in Python, SQL, and one or more of: R, Java, Scala Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB) Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi) Bonus: experience with …
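As a sketch of the workflow tools mentioned (Airflow here), a minimal two-task DAG might look like this; the DAG id and task bodies are placeholders, and the `schedule` argument assumes Airflow 2.4 or later:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder body; a real task would pull data from a source system.
    print("extracting...")

def load():
    # Placeholder body; a real task would write to a warehouse.
    print("loading...")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```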
Leeds, England, United Kingdom · Hybrid/WFH Options
KPMG UK
it? Expertise in data engineering/analytics/architecture/security using native technologies of at least one cloud platform (AWS, Azure, GCP) Expertise in prominent languages such as Python, Scala, Spark, SQL. Experience working with any database technologies from an application programming perspective - Oracle, MySQL, MongoDB, etc. Expertise in leading the design, build and maintenance of data pipelines and …
control, task tracking). Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally. Experience in other programming languages for data manipulation (e.g., Python, Scala). Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies. Understanding of the principles of data …
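To illustrate what "making ETL processes perform optimally" can mean concretely, here is a hedged PySpark sketch that leans on column pruning, predicate pushdown, and partitioned output; all names and paths are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-optimized").getOrCreate()

# Selecting only the needed columns and filtering early lets Spark push the
# predicate down to the Parquet reader instead of scanning whole files.
trips = (
    spark.read.parquet("s3://example-bucket/trips/")
    .select("trip_id", "city", "fare", "trip_date")
    .filter(F.col("trip_date") >= "2024-01-01")
)

# Partitioning output by a low-cardinality column keeps downstream reads cheap.
(
    trips.groupBy("city", "trip_date")
    .agg(F.sum("fare").alias("daily_fares"))
    .write.mode("overwrite")
    .partitionBy("city")
    .parquet("s3://example-bucket/daily_fares/")
)
```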