models in close cooperation with our data science team • Experiment in your domain to improve precision, recall, or cost savings. Requirements: Expert skills in Java or Python • Experience with Apache Spark or PySpark • Experience writing software for the cloud (AWS or GCP) • Speaking and writing in English enables you to take part in day-to-day conversations in the …
methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data engineering …
primarily GCP. Experience with some or all of the services below would put you at the top of our list: Google Cloud Storage • Google Data Transfer Service • Google Dataflow (Apache Beam) • Google Pub/Sub • Google Cloud Run • BigQuery or any RDBMS • Python • Debezium/Kafka • dbt (Data Build tool). Interview process: Interviewing is a two-way process and we want you …
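For context on what "Dataflow (Apache Beam)" work typically looks like, here is a minimal, illustrative Beam pipeline in Python. It is not taken from the listing; the bucket paths and step names are hypothetical placeholders.

```python
# Minimal Apache Beam pipeline sketch (Python SDK).
# Bucket paths are hypothetical; on Google Dataflow you would pass
# --runner=DataflowRunner plus project/region pipeline options.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Strip" >> beam.Map(str.strip)          # normalise each line
            | "DropEmpty" >> beam.Filter(bool)        # discard blank lines
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/events")
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs locally on the DirectRunner or at scale on Dataflow, which is the portability that pairing Beam with the GCP services above buys.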
Python, Snowflake • Strong SQL query-writing skills and excellent understanding of SQL query performance optimization • Very good knowledge of Agile and SDLC processes • Strong experience with streaming architectures, preferably Apache Spark • Knowledge of cloud concepts (Azure), data warehouses and services • Able to demonstrate very good analytical and problem-solving skills • Sound written and verbal communication skills and ability to …
ability to learn others as needed: Distributed or large-scale systems • MySQL/SQL database design, query optimization, and administration • Web development using HTML, CSS, JavaScript, Vue/React • Apache web server and related modules • Cloud platforms such as AWS, Google Cloud, Azure • CI/CD pipeline setup, testing, and administration • Networking and firewall configuration • Natural language processing. Responsibilities: …
of automation. IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE: Cloud-based experience • Microservice or serverless architecture • Big Data/messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED: Please either apply by clicking online or email me directly. For further information please call me on 07704 152 640. …
technologies – Azure, AWS, GCP, Snowflake, Databricks. Must have: Hands-on experience with at least 2 hyperscalers (GCP/AWS/Azure) and specifically with Big Data processing services (Apache Spark, Beam or equivalent). In-depth knowledge of key technologies like BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka … skills. A minimum of 5 years' experience in a similar role. Ability to lead and mentor the architects. Mandatory skills [at least 2 hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark/Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable skills: Designing Databricks-based solutions …
in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management • Proficiency in Python, SQL, Scala, or Java • Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure Data Factory • Strong understanding of data architecture principles, data modelling, and data governance • Experience with cloud-based data platforms, including Azure and/or …
City of London, London, United Kingdom · Hybrid / WFH Options
iO Associates
Skills & Experience: Strong experience with Snowflake data warehousing • Solid AWS cloud engineering experience • Proficient in Python for data engineering workflows • Skilled in building and maintaining Airflow DAGs • Familiarity with Apache Iceberg for table format and data lake optimisation. If this could be of interest, please get in touch with Alex Lang at iO Associates to apply and for more …
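As an illustration of the "building and maintaining Airflow DAGs" requirement, a minimal sketch follows — it is not code from the listing, and the DAG id, schedule, and task bodies are hypothetical.

```python
# Minimal Airflow DAG sketch: a daily two-step extract/transform pipeline.
# Requires Airflow 2.x; all names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("e.g. land raw files from S3 into a Snowflake staging schema")


def transform():
    print("e.g. run SQL/dbt transformations inside Snowflake")


with DAG(
    dag_id="example_snowflake_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run extract before transform
```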
Computer Science, Engineering, or a related field, or equivalent industry experience. Preferred Qualifications: Experience or interest in mentoring junior engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other cloud platforms. Knowledge of …
the development of and adherence to data governance standards. Data-Driven Culture Champion: Advocate for the strategic use of data across the organization. Skills-wise, you'll definitely have: Expertise in Apache Spark • Advanced proficiency in Python and PySpark • Extensive experience with Databricks • Advanced SQL knowledge • Proven leadership abilities in data engineering • Strong experience in building and managing CI/CD …
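To make the Spark/PySpark expectation concrete, here is a minimal PySpark aggregation of the kind such roles centre on — a sketch only, with hypothetical input path and column names.

```python
# Minimal PySpark sketch: daily revenue aggregation over a parquet dataset.
# Input path and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

orders = spark.read.parquet("/data/orders")

daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))  # derive a date column
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))              # total revenue per day
)

daily_revenue.write.mode("overwrite").parquet("/data/daily_revenue")
```

On Databricks a SparkSession is already provided, so getOrCreate() simply returns it; the transformation code is unchanged.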
data-driven performance analysis and optimization • Strong communication skills and ability to work in a team • Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS: Experience with Kubernetes deployment architectures • Apache NiFi experience • Experience building trading controls within an investment bank. ABOUT GOLDMAN SACHS: At Goldman Sachs, we commit our people, capital, and ideas to help our clients, shareholders, and …
managing databases (we use Elasticsearch/MongoDB/PostgreSQL). Experience with SQL. Experience with data versioning tools. Experience developing and maintaining data infrastructure for ETL pipelines, such as Apache Airflow. EPIC JOB + EPIC BENEFITS = EPIC LIFE: We pay 100% for benefits except for PMI (for dependents). Our current benefits package includes pension, private medical insurance, health …
Azure, AWS, GCP) • Hands-on experience with SQL, data pipelines, data orchestration and integration tools • Experience in data platforms on-premises/cloud using technologies such as Hadoop, Kafka, Apache Spark, Apache Flink, object, relational and NoSQL data stores • Hands-on experience with big data application development and cloud data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery …
regulatory requirements. Lead development of the modern OMS using Spring Boot, Angular, and PostgreSQL with scalable deployments via Docker and Kubernetes. Guide the implementation of enterprise integration frameworks (RabbitMQ, Camunda) and analytics/reporting solutions (JasperReports, Bold BI). Team Development & Culture: Lead, mentor, and inspire a high-performing R&D organization of software engineers, architects, QA professionals … progressive leadership experience in enterprise software R&D, including legacy system management and modern application development. Technical expertise in Oracle Forms, Java Spring Boot, Angular, PostgreSQL, RabbitMQ, Docker, Kubernetes, Camunda, JasperReports, and business intelligence tools. Experience deploying and maintaining applications across cloud and hybrid infrastructures (Azure, AWS, on-prem). Proven ability to lead distributed and cross-functional …
Substantial experience using tools for statistical modelling of large data sets • Some familiarity with data workflow management tools such as Airflow, as well as big data technologies such as Apache Spark or other caching and analytics technologies • Expertise in model training, statistics, model evaluation, deployment and optimisation, including RAG-based architectures.
with new methodologies to enhance the user experience. Key skills: Senior Data Scientist experience • Commercial experience in Generative AI and recommender systems • Strong Python and SQL experience • Spark/Apache Airflow • LLM experience • MLOps experience • AWS. Additional information: This role offers a strong salary of up to £95,000 (depending on experience/skill) with hybrid working (2 days …
are constantly looking for components to adopt in order to enhance our platform. What you'll do: Develop across our evolving technology stack - we're using Python, Java, Kubernetes, Apache Spark, Postgres, ArgoCD, Argo Workflows, Seldon, MLflow and more. We are migrating to the AWS cloud and adopting many of the services available in that environment. You will have the …
Jenkins, TeamCity • Scripting languages such as PowerShell, bash • Observability/monitoring: Prometheus, Grafana, Splunk • Containerisation tools such as Docker, K8s, OpenShift, EC, containers • Hosting technologies such as IIS, nginx, Apache, App Service, Lightsail • Analytical and creative approach to problem solving. We encourage you to apply, even if you don't meet all of the requirements. We value your growth …
future-proofing of the data pipelines. ETL and Automation Excellence: Lead the development of specialized ETL workflows, ensuring they are fully automated and optimized for performance using tools like Apache Airflow, Snowflake, and other cloud-based technologies. Drive improvements across all stages of the ETL cycle, including data extraction, transformation, and loading. Infrastructure & Pipeline Enhancement: Spearhead the upgrading of …