…tools (QuickSight, Power BI, Tableau, Looker, etc.)
Interest or experience in building internal data communities or enablement programs
Working with diverse data sources (APIs, CRMs, SFTP, databases) and formats (Parquet, JSON, XML, CSV)
Exposure to machine learning models or AI agents

Why Join Us
Help shape the future of data in an organization that treats data as a product …
Data modelling (building optimised and efficient data marts and warehouses in the cloud)
Work with Infrastructure as Code (Terraform) and containerising applications (Docker)
Work with AWS, S3, SQS, Iceberg, Parquet, Glue and EMR for our Data Lake
Experience developing CI/CD pipelines

More information: Enjoy fantastic perks like private healthcare & dental insurance, a generous work-from-abroad policy …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
…hands-on AWS experience – S3, Redshift, Glue essential.
Proven experience building ETL/ELT pipelines in cloud environments.
Proficient in working with structured/unstructured data (JSON, XML, CSV, Parquet).
Skilled in working with relational databases and data lake architectures.
Experienced with Matillion and modern data visualisation tools (QuickSight, Tableau, Looker, etc.).
Strong scripting and Linux/…
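The multi-format requirement above (JSON, XML, CSV, Parquet) usually comes down to normalising heterogeneous records into one rectangular shape before loading them into a warehouse. A minimal sketch of that step using only the Python standard library (writing Parquet itself would typically need a library such as pyarrow, omitted here); the field names are illustrative, not from any of these listings:

```python
import csv
import io
import json

def json_records_to_csv(json_text: str, columns: list[str]) -> str:
    """Flatten a JSON array of objects into CSV with a fixed column order.

    Missing keys become empty strings, so ragged source records still
    produce a rectangular table ready for a warehouse load step.
    """
    records = json.loads(json_text)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow({col: rec.get(col, "") for col in columns})
    return buf.getvalue()

# Illustrative input: two records with different key sets.
raw = '[{"id": 1, "name": "a"}, {"id": 2, "region": "EU"}]'
table = json_records_to_csv(raw, ["id", "name", "region"])
```

The fixed `columns` list is the important design choice: downstream loaders (Redshift COPY, Snowflake stages, Glue crawlers) expect a stable schema, so the normalisation step owns the column order rather than inferring it per file.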
City of London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
…processing and automation
Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics
Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.)
Familiarity with workflow automation tools (e.g., Power Automate) and/or Power Apps is desirable
Excellent interpersonal and communication skills with the ability to work cross-functionally and …
South East London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
…processing and automation
Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics
Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.)
Familiarity with workflow automation tools (e.g., Power Automate) and/or Power Apps is desirable
Excellent interpersonal and communication skills with the ability to work cross-functionally and …
…data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery)
Expertise in building data architectures that support batch and streaming paradigms
Experience with standards such as JSON, XML, YAML, Avro, Parquet
Strong communication skills
Open to learning new technologies, methodologies, and skills

As the successful Data Engineering Manager you will be responsible for:
Building and maintaining data pipelines
Identifying and …
…data
Proficiency with Linux development, Git, containers, and CI/CD workflows
Familiarity with SQL and at least one columnar or time-series data store (e.g., kdb+, ClickHouse, InfluxDB, Parquet/Arrow)
Excellent problem-solving abilities, attention to detail, and clear communication skills

Nice to Have:
Prior exposure to execution algos, TCA, order-routing, or market-impact modelling
Knowledge …
With solid software engineering fundamentals, fluent in Java and Python (Rust is a plus).
Knowledgeable about data lake systems like Athena, and big data storage formats such as Parquet, HDF5, ORC, focusing on data ingestion.
Driven by working in an intellectually engaging environment with top industry minds, where constructive debates are encouraged.
Excited about working in a start…
Bonus Points For
Workflow orchestration tools like Airflow.
Working knowledge of Kafka and Kafka Connect.
Experience with Delta Lake and lakehouse architectures.
Proficiency in data serialization formats: JSON, XML, Parquet, YAML.
Cloud-based data services experience.

Ready to build the future of data? If you're a collaborative, forward-thinking engineer who wants to work on meaningful, complex problems …
Job Description: AWS stack – data landed in S3, Lambda triggers, data quality checks, data written back out to AWS S3 (Parquet format), Snowflake for the dimensional model. Design and build the data pipelines; work with stakeholders to understand the data transformations, supported by BAs; build out the data pipelines, moving data through the layers of the data architecture (Medallion architecture …
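The flow this listing describes (raw data landed in S3, a trigger running data-quality checks, validated output written onward for Snowflake modelling) is the bronze-to-silver promotion step of a Medallion architecture. A toy sketch of that step, assuming nothing about the actual implementation: the quality rule, key names, and the in-memory dicts standing in for S3 buckets are all illustrative, and a real pipeline would use boto3, write Parquet, and run this inside an S3-triggered Lambda:

```python
# Bronze -> Silver promotion with a simple data-quality gate.
# Dicts stand in for S3 layers; promote() plays the role of the
# Lambda handler invoked when a new object lands in the bronze layer.

bronze: dict[str, list[dict]] = {}   # raw landing layer
silver: dict[str, list[dict]] = {}   # validated layer
quarantine: list[dict] = []          # rows failing quality checks

REQUIRED_FIELDS = {"order_id", "amount"}  # illustrative rule

def is_valid(row: dict) -> bool:
    """Data-quality check: required fields present and amount non-negative."""
    return REQUIRED_FIELDS <= row.keys() and row["amount"] >= 0

def promote(key: str) -> int:
    """Move valid rows from bronze to silver; quarantine the rest.

    Returns the number of rows promoted, mimicking what a
    triggered function might log for observability.
    """
    rows = bronze.get(key, [])
    good = [r for r in rows if is_valid(r)]
    quarantine.extend(r for r in rows if not is_valid(r))
    silver[key] = good
    return len(good)

bronze["orders/2024-01-01.json"] = [
    {"order_id": 1, "amount": 9.5},
    {"order_id": 2, "amount": -1},   # fails the quality gate
    {"amount": 3.0},                 # missing order_id
]
promoted = promote("orders/2024-01-01.json")
```

Quarantining rather than dropping bad rows is the usual choice here: the silver layer stays clean for the Snowflake dimensional model, while rejected records remain inspectable by the BAs supporting the transformation work.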
…similar)
Interest in distributed systems, database internals, or storage engines
Product sense: you care about how infrastructure gets used
Bonus if you've worked with tech like Apache Arrow, Parquet, DataFusion, ClickHouse, DuckDB

What's on offer:
£120k-£150k base + meaningful equity
Full-time, on-site in Shoreditch (Monday-Friday)
A chance to do foundational work with real …