Expertise in data warehousing, data modelling, and data integration. Experience in MLOps and machine learning pipelines. Proficiency in SQL and data manipulation languages. Experience with big data platforms (including Apache Arrow, Apache Spark, Apache Iceberg, and ClickHouse) and cloud-based infrastructure on AWS. Education & Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related …
able to work across the full data cycle. • Proven experience working with AWS data technologies (S3, Redshift, Glue, Lambda, Lake Formation, CloudFormation), GitHub, CI/CD • Coding experience in Apache Spark, Iceberg, or Python (Pandas) • Experience in change and release management. • Experience in data warehouse design and data modelling • Experience managing data migration projects. • Cloud data platform development … the AWS services like Redshift, Lambda, S3, Step Functions, Batch, CloudFormation, Lake Formation, CodeBuild, CI/CD, GitHub, IAM, SQS, SNS, Aurora DB • Good experience with DBT, Apache Iceberg, Docker, Microsoft BI stack (nice to have) • Experience in data warehouse design (Kimball, lakehouse, medallion, and data vault) is a definite preference, as is knowledge …
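Several of these listings pair Apache Spark with Iceberg. As an illustration only, here is a minimal PySpark sketch of writing a DataFrame to an Iceberg table; the catalog name, warehouse bucket, and table identifier are hypothetical, and it assumes the Iceberg Spark runtime JAR is available on the classpath:

```python
# Minimal sketch: writing a small DataFrame to an Apache Iceberg table with
# PySpark. Catalog, bucket, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-demo")
    # Register a hypothetical Iceberg catalog named "demo" backed by Hadoop tables.
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3://my-bucket/warehouse")  # hypothetical bucket
    .getOrCreate()
)

df = spark.createDataFrame([(1, "alice"), (2, "bob")], ["id", "name"])

# DataFrameWriterV2 API: create or replace the Iceberg table from the DataFrame.
df.writeTo("demo.db.users").using("iceberg").createOrReplace()
```

The writeTo(...).using("iceberg") call is Spark's DataFrameWriterV2 API; the same table could equally be created with SQL DDL against the registered catalog.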
Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgresDB, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing …
MySQL Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery. Master's degree in Computer Science or Engineering-related field. Get to know us better: YouGov is a global online research …
Terraform and Kubernetes is a plus! A genuine excitement for significantly scaling large data systems. Technologies we use (experience not required): AWS serverless architectures, Kubernetes, Spark, Flink, Databricks, Parquet, Iceberg, Delta Lake, Paimon, Terraform, GitHub (including GitHub Actions), Java, PostgreSQL. About Chainalysis: Blockchain technology is powering a growing wave of innovation. Businesses and governments around the world are using …
pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
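For the Airflow pipeline work this role describes, a minimal sketch of a daily DAG using the TaskFlow API might look like the following; the DAG id and task bodies are hypothetical placeholders (the schedule argument assumes Airflow 2.4+; older versions use schedule_interval):

```python
# Minimal sketch of an Airflow data pipeline using the TaskFlow API.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling rows from a source system (e.g. Postgres or Kafka).
        return [{"id": 1, "value": 42}]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for writing to the warehouse or an Iceberg table.
        print(f"loaded {len(rows)} rows")

    load(extract())


example_pipeline()
```

Passing the return value of extract() into load() wires the dependency implicitly; Airflow moves the payload between tasks via XCom, so this pattern suits small metadata, not bulk data.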
solve any given problem. Technologies We Use: A variety of languages, including Java, Python, Rust, and Go for backend and TypeScript for frontend. Open-source technologies like Cassandra, Spark, Iceberg, Elasticsearch, Kubernetes, React, and Redux. Industry-standard build tooling, including Gradle for Java, Cargo for Rust, Hatch for Python, Webpack & PNPM for TypeScript. What We Value: Strong engineering background …
Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka). Experience managing operational data stores with high availability, performance, and scalability. Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures. Proven ability to build, deliver, and support modern data platforms at scale. Strong knowledge of data governance, data quality, and data cataloguing. Experience … with modern database technologies, including Iceberg, NoSQL, and vector databases. Embraces innovation and works closely with scientists and partners to explore cutting-edge technology. Knowledge of master data, metadata, and reference data management. Understanding of Agile practices and sprint-based methodologies. Active contributor to knowledge sharing and collaboration. Desirable Knowledge, Skills and Experience: Familiarity with genomics and associated data …
and scalable environments for our data platforms. Leveraging cloud-native technologies and AWS tools such as AWS S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring seamless execution of massive queries, including 600+ billion-row queries in Redshift, by designing …
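Where teams "prepare to adopt Apache Iceberg" alongside Glue and Trino, Iceberg tables can also be queried directly from Python. A minimal sketch using the pyiceberg library, assuming an AWS Glue catalog, hypothetical database and table names, and AWS credentials taken from the environment:

```python
# Minimal sketch of querying an Iceberg table from Python with pyiceberg
# against an AWS Glue catalog. Names below are hypothetical placeholders.
from pyiceberg.catalog import load_catalog

catalog = load_catalog("glue_catalog", **{"type": "glue"})
table = catalog.load_table("analytics.events")  # hypothetical db.table

# Push down a column projection and row filter, then materialise to pandas.
df = table.scan(
    selected_fields=("event_id", "event_ts"),
    row_filter="event_ts >= '2024-01-01'",
).to_pandas()
print(df.head())
```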
Science or a related field. Experience working on and shipping live service games. Experience working on Spring Boot projects. Experience deploying software/services on Kubernetes. Experience working with Apache Spark and Iceberg. …
Terraform or Ansible for deployment and infrastructure management. Hands-on experience with: ETL/ELT orchestration and pipeline tools (Airflow, Airbyte, DBT, etc.), data warehousing tools and platforms (Snowflake, Iceberg, etc.), SQL databases, particularly MySQL. Desired Experience: Experience with cloud-based services, particularly AWS. Proven ability to manage stakeholders, their expectations and explain complex problems or solutions in a …
in a hybrid environment requiring clear and effective communication. Strong engineering fundamentals with a passion for simplicity and precision. Ideal, But Not Required: Experience with database technologies (Postgres, DynamoDB, Apache Iceberg). Experience with serverless technologies (e.g. Lambda). Required Experience: Prior industry experience with Python. Prior industry experience with public cloud providers (preferably AWS). Our Offer: Work …
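Since this listing flags serverless technologies such as Lambda alongside DynamoDB, a minimal Python handler sketch follows; the table name, environment variable, and event shape are all hypothetical:

```python
# Minimal sketch of a Python AWS Lambda handler persisting events to DynamoDB.
import json
import os

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "events"))  # hypothetical table


def handler(event, context):
    # Persist the incoming record, then acknowledge with an HTTP-style response.
    table.put_item(Item={"pk": event["id"], "payload": json.dumps(event)})
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```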
DBT, and data governance frameworks. Preferred: Certifications in cloud/data technologies. Experience with API/interface modelling and CI/CD (e.g. GitHub Actions). Knowledge of Atlan and Iceberg tables. Reference: AMC/SCU/SDA/3007. Postcode: SW1 …
PowerDesigner. Strong SQL and Python skills (Snowflake or similar). AWS experience (Lambda, SNS, S3, EKS, API Gateway). Familiarity with data governance (GDPR, HIPAA). Bonus points for: DBT, Airflow, Atlan, Iceberg, CI/CD, API modelling. The vibe: You'll be joining a collaborative, inclusive team that values technical excellence and continuous learning. Flexible working, strong L&D support, and …
external fundamental and alternative market data. What you offer: Experience with AWS cloud. Experience programming, debugging, and running production systems in Python. Exposure to open-source technologies such as Iceberg, Trino, and Airflow. Passionate about the use and adoption of these capabilities, focused on user experience and ensuring our business sees real value from the technical deliveries. A learning …
development framework. Be familiar with Data Build Tool (DBT) for building data models, tests, and transformations. Have a thorough understanding of distributed file and table formats like Parquet, Delta, Iceberg, Hudi. Preferred: Experience with Infrastructure as Code (IaC) solutions such as Terraform or Pulumi. Experience with modern CI/CD DevOps frameworks. Experience developing data visualizations using Power BI …
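To make the "distributed file and table formats" requirement concrete, here is a minimal pyarrow sketch of writing and reading Parquet; the file path and schema are hypothetical:

```python
# Minimal sketch of the columnar Parquet format these listings mention:
# write a small table, then read back a single column.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]})

# Parquet stores data column-by-column with per-column statistics, which is
# what lets engines like Spark and Trino prune reads.
pq.write_table(table, "users.parquet")

# Read back only one column; the others are never deserialised.
names = pq.read_table("users.parquet", columns=["name"])
print(names.to_pydict())
```

Table formats such as Delta, Iceberg, and Hudi layer transactional metadata (snapshots, schema evolution, manifests) on top of files like these.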
experience. Data modelling (building optimised and efficient data marts and warehouses in the cloud). Work with infrastructure as code (Terraform) and containerising applications (Docker). Work with AWS, S3, SQS, Iceberg, Parquet, Glue, and EMR for our Data Lake. Experience developing CI/CD pipelines. More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work from abroad …
evolution. Has experience (or strong interest) in building real-time or event-driven architectures. Modern Data Stack Includes: Python, SQL, Snowflake, Postgres, AWS (S3, ECS, Terraform), Airflow, dbt, Docker, Apache Spark, Iceberg. What they're looking for: Solid experience as a Senior/Lead/Principal Data Engineer, ideally with some line management or mentoring. Proven ability to …
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications. Knowledge of Apache Airflow, DBT, GitHub Actions. Experience with Iceberg tables and data product thinking. Why Apply? Work on high-impact, high-scale client projects. Join a technically elite team with …
with Interface/API data modelling. Experience with CI/CD GitHub Actions (or similar). AWS fundamentals (e.g., AWS Certified Data Engineer). Knowledge of Snowflake/SQL. Knowledge of Apache Airflow. Knowledge of DBT. Familiarity with Atlan for data catalog and metadata management. Understanding of Iceberg tables. Who we are: We're a business with a global reach that …