and scalable environments for our data platforms. Leveraging cloud-native and AWS technologies such as S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring the seamless execution of massive queries, including 600+ billion-row queries in Redshift, by designing and …
developing and implementing enterprise data models. Experience with interface/API data modelling; experience with CI/CD using GitHub Actions (or similar); knowledge of Snowflake/SQL; knowledge of Apache Airflow; knowledge of dbt; familiarity with Atlan for data catalog and metadata management; understanding of Iceberg tables. Who we are: We're a business with a global reach that empowers …
ideally in a high-ownership, fast-paced environment. Nice to have: Experience working in the Payments, Fintech, or Financial Crime domain (e.g., fraud detection, AML, KYC). Experience with Apache Flink or other streaming data frameworks is highly desirable. Experience working in teams, building and maintaining data science and AI solutions. Experience integrating with third-party APIs, especially in …
North West London, London, United Kingdom (Hybrid / WFH Options)
Anson McCade
knowledge of Kafka, Confluent, and event-driven architecture; hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures; strong architectural understanding across AWS, Azure, GCP, and Snowflake; familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java); knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures; strong grasp of data governance …
MySQL, PostgreSQL, or Oracle. Experience with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the team. Strong analytical and problem …
data into a data platform using Fivetran. Experience of developing BI dashboards using Power BI. Knowledge of security concepts relevant to Azure. Experience of workflow management tools such as Apache Airflow. Interested in the role? Complete the online application. We look forward to getting to know you. …
MySQL. Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, and New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, and BigQuery. Master's degree in Computer Science or an Engineering-related field. Get to know us better: YouGov is a global online research company …
might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have: Worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows, so a good understanding of the quirks of operating Airflow at scale would be helpful. Experience …
technical direction to a growing team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in hybrid mode, both on-premises and in AWS, utilising technologies such as EKS, S3, and FSx. The main purpose of this role is to …
requirements into data solutions; monitor and improve pipeline performance and reliability; maintain documentation of systems, workflows, and configs. Tech environment: Python, SQL/PLSQL (MS SQL + Oracle), PySpark; Apache Airflow (MWAA), AWS Glue, Athena; AWS services (CDK, S3, data lake architectures); Git, JIRA. You should apply if you have: strong Python and SQL skills; proven experience designing data …
and BI. Advanced skills in SQL and Python, with hands-on experience in relational databases across cloud and on-prem environments. Familiarity with modern data technologies such as Apache Spark, Kafka, or Snowflake. A comprehensive understanding of the data engineering lifecycle, including Agile delivery, DevOps, Git, APIs, containers, microservices, and pipeline orchestration. Nice to have: DP …
evolution; has experience (or strong interest) in building real-time or event-driven architectures. Modern data stack includes: Python, SQL; Snowflake, Postgres; AWS (S3, ECS, Terraform); Airflow, dbt, Docker; Apache Spark, Iceberg. What they're looking for: solid experience as a Senior/Lead/Principal Data Engineer, ideally with some line management or mentoring; proven ability to design …
governance SME; support teams with tooling, guidance, and best practices. About you: Strong technical foundation in data governance architecture and tooling. You've worked with tools such as DataHub, Apache Airflow, AWS, dbt, Snowflake, BigQuery, or similar. Hands-on experience building and maintaining centralized data inventories, business glossaries, and data mapping frameworks. Proficient in automating data classification and lineage …
South East London, London, United Kingdom (Hybrid / WFH Options)
TEN10 SOLUTIONS LIMITED
and data validation techniques. Experience using test automation frameworks for data pipelines and ETL workflows. Strong communication and stakeholder management skills. Nice-to-have: Hands-on experience with Databricks, Apache Spark, and Azure Deequ. Familiarity with Big Data tools and distributed data processing. Experience with data observability and data quality monitoring. Proficiency with CI/CD tools like …
methodologies. Collaborating with stakeholders to define data strategies, implement data governance policies, and ensure data security and compliance. About you: Strong technical proficiency in data engineering technologies, such as Apache Airflow, ClickHouse, ETL tools, and SQL databases. Deep understanding of data modeling, ETL processes, data integration, and data warehousing concepts. Proficiency in programming languages commonly used in data engineering …
to join a fast-growing team that plays an integral part in the revenue-producing arm of a company, then our team is for you. Technologies include Scala, Python, Apache Flink, Spark, Databricks, and AWS (ECS, Lambda, DynamoDB, WAF, among others). Experience in these areas is preferred but not required. Qualifications: You collaborate with team members and project …
processes; knowledge of MiFID II and Dodd-Frank regulations and controls; knowledge/experience of FIX flows - TradeWeb, RFQ Hub, BlackRock and Bloomberg, RFQ workflows. Additional technology experience: React JS, Apache NiFi, Mongo, DBaaS, SaaS, TIBCO/Solace or similar messaging middleware. Education: Bachelor's degree or equivalent experience operating in a similar role. This job description provides a high …
of automation. IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE.... Cloud-based experience; microservice architecture or serverless architecture; Big Data/messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to . For further information, please call me on . I can …
under pressure. Skills: Programming Languages: Strong proficiency in Python and PySpark. Database Management: Expertise in SQL for data manipulation and querying. Data Orchestration: Experience with orchestration tools such as Apache Airflow or Dagster. Containerization: Familiarity with containerization technologies, specifically Kubernetes and Docker. Data Pipelines: Proven experience in designing and implementing data pipelines, working with big data technologies and architectures. …
class-leading data and ML platform infrastructure, balancing maintenance with exciting greenfield projects. Develop and maintain our real-time model serving infrastructure, utilising technologies such as Kafka, Python, Docker, Apache Flink, Airflow, and Databricks. Actively assist in model development and debugging using tools like PyTorch, scikit-learn, MLflow, and pandas, working with models from gradient boosting classifiers to custom …
Computer Science, Engineering, or a related field, or equivalent industry experience. Preferred Qualifications: Experience or interest in mentoring junior engineers. Familiarity with data-centric workflows and pipeline orchestration (e.g., Apache Airflow). Proficiency in data validation, anomaly detection, or debugging using tools like Pandas, Polars, or data.table/R. Experience working with AWS or other cloud platforms. Knowledge of …
IIBA (International Institute of Business Analysis)
data-driven performance analysis and optimization. Strong communication skills and ability to work in a team. Strong analytical and problem-solving skills. PREFERRED QUALIFICATIONS: Experience with Kubernetes deployment architectures. Apache NiFi experience. Experience building trading controls within an investment bank. About Goldman Sachs: At Goldman Sachs, we commit our people, capital, and ideas to help our clients, shareholders, and …
flat structure, you'll work autonomously on business-critical projects and collaborate with others throughout our team and user base. Our tech stack includes TypeScript, Python, Node, WebAssembly, WebGL, Apache Arrow, DuckDB, Kubernetes, and React. For the best possible user experience, we have developed various technologies in-house, including a custom WebGL rendering engine, data visualization library, reactive SQL …