…Agile software development framework. Be familiar with Data Build Tool (DBT) for building data models, tests, and transformations. Have a thorough understanding of distributed file and table formats such as Parquet, Delta, Iceberg, and Hudi. Preferred: experience with Infrastructure as Code (IaC) solutions such as Terraform or Pulumi; experience with modern CI/CD DevOps frameworks; experience developing data visualizations using …
…file drops, message queues (SQS, Kafka), and 3rd party SaaS integrations, with idempotency and error handling. Storage & Query Engines: Strong with RDBMS (PostgreSQL, MySQL), NoSQL (DynamoDB, Cassandra), data lakes (Parquet, ORC), and warehouse paradigms. Observability & Quality: Deep familiarity with metrics, logging, tracing, and data quality tools (e.g., Great Expectations, Monte Carlo, custom validation/test suites). Security & Governance …
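By way of illustration, here is a minimal Python sketch of the idempotent queue-consumption pattern this listing describes, using boto3 against SQS. The queue URL is hypothetical, and the in-memory `seen_ids` set stands in for a durable deduplication store (e.g. DynamoDB):

```python
import json
import boto3

# Hypothetical queue URL -- replace with your own.
QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/ingest-queue"

sqs = boto3.client("sqs")
seen_ids: set[str] = set()  # stand-in for a durable dedup store

def process(record: dict) -> None:
    """Placeholder for the actual transformation/load step."""
    print(record)

while True:
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling
    )
    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])
        # Idempotency: standard SQS queues deliver at-least-once,
        # so skip messages we have already handled.
        if msg["MessageId"] in seen_ids:
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
            continue
        try:
            process(body)
        except Exception:
            # Error handling: leave the message on the queue; after the
            # configured max receives it lands in the dead-letter queue.
            continue
        seen_ids.add(msg["MessageId"])
        # Delete only after successful processing.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```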
Data modelling (building optimised and efficient data marts and warehouses in the cloud); working with Infrastructure as Code (Terraform) and containerised applications (Docker); working with AWS, S3, SQS, Iceberg, Parquet, Glue and EMR for our Data Lake; experience developing CI/CD pipelines. More information: enjoy fantastic perks like private healthcare & dental insurance, a generous work-from-abroad policy …
South East London, London, United Kingdom Hybrid / WFH Options
Datatech Analytics
…processing and automation. Solid understanding of ETL/ELT workflows, data modelling, and structuring datasets for analytics. Experience working with large, complex datasets and APIs across formats (CSV, JSON, Parquet, etc.). Familiarity with workflow automation tools (e.g., Power Automate) and/or Power Apps is desirable. Excellent interpersonal and communication skills with the ability to work cross-functionally and …
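As a small illustration of moving data between the formats mentioned above, the following pandas sketch reads CSV and newline-delimited JSON, aligns a join key, and lands the result as Parquet. File names and column names are invented for the example:

```python
import pandas as pd

# Hypothetical input files and columns -- purely illustrative.
orders = pd.read_csv("orders.csv", parse_dates=["created_at"])
events = pd.read_json("events.json", lines=True)  # newline-delimited JSON

# Structure the datasets for analytics: type the join key consistently,
# then join and persist in a columnar format.
orders["order_id"] = orders["order_id"].astype("string")
events["order_id"] = events["order_id"].astype("string")

analytics = orders.merge(events, on="order_id", how="left")
analytics.to_parquet("analytics.parquet", index=False)  # needs pyarrow or fastparquet
```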
…data warehousing (e.g. Hadoop, Spark, Redshift, Snowflake, GCP BigQuery). Expertise in building data architectures that support batch and streaming paradigms. Experience with standards such as JSON, XML, YAML, Avro, and Parquet. Strong communication skills. Open to learning new technologies, methodologies, and skills. As the successful Data Engineering Manager you will be responsible for: building and maintaining data pipelines; identifying and …
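To make the batch-versus-streaming distinction concrete, here is a minimal PySpark sketch showing the two paradigms side by side. The bucket, broker address, and topic name are assumptions, not part of the listing:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-and-streaming").getOrCreate()

# Batch paradigm: read a bounded Parquet dataset, aggregate, write out.
batch = spark.read.parquet("s3://example-bucket/events/")
(batch.groupBy("event_type").count()
      .write.mode("overwrite").parquet("s3://example-bucket/daily_counts/"))

# Streaming paradigm: consume the same logical source as an unbounded
# stream from Kafka (needs the spark-sql-kafka package on the classpath).
stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)
query = (
    stream.selectExpr("CAST(value AS STRING) AS payload")
    .writeStream.format("parquet")
    .option("path", "s3://example-bucket/raw_stream/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/raw_stream/")
    .start()
)
```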
…Data Lake and create secure, efficient, and scalable environments for our data platforms. Leveraging cloud-native technologies and AWS tools such as S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring seamless execution of massive queries, including 600+ billion-row …
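For a concrete sense of how such queries are issued, below is a minimal sketch using the Trino Python client; the coordinator host, catalog, and table names are assumptions. The point of the pattern is that filters and aggregates are pushed down to the engine rather than pulling billions of rows into the client:

```python
import trino  # pip install trino

# Connection details are assumptions; point them at your own coordinator.
conn = trino.dbapi.connect(
    host="trino.internal.example.com",
    port=8080,
    user="data-platform",
    catalog="hive",   # Parquet tables today; an "iceberg" catalog later
    schema="lake",
)
cur = conn.cursor()
# Push the heavy lifting (filter + aggregate) down to Trino.
cur.execute(
    "SELECT event_date, count(*) AS n "
    "FROM events WHERE event_date >= DATE '2024-01-01' "
    "GROUP BY event_date"
)
for event_date, n in cur.fetchall():
    print(event_date, n)
```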
Birmingham, Staffordshire, United Kingdom Hybrid / WFH Options
Internetwork Expert
…Change Data Capture (CDC) and change tracking, stream processing, database design, and Machine Learning and AI integration. Hands-on experience with: Azure Databricks, Python/PySpark, Microsoft SQL Server, Azure Blob Storage, Parquet file formats, and Azure Data Factory. Proven experience building secure, scalable, and high-performing data pipelines. Ability to solve complex technical problems and work collaboratively across teams. Excellent communication and …
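As one illustration of the change-tracking idea in the stack above, this PySpark sketch performs a watermark-based incremental load from one Azure storage container to another. The storage account, container, and column names are invented, and a full CDC flow would typically upsert via a Delta Lake MERGE rather than append:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# Illustrative ADLS Gen2 paths (abfss) -- not real accounts.
SOURCE = "abfss://raw@examplestore.dfs.core.windows.net/customers/"
TARGET = "abfss://curated@examplestore.dfs.core.windows.net/customers/"

# Watermark-style change tracking: pick up only rows modified since the
# last successful run. The literal timestamp stands in for a value read
# from a control table.
last_run = "2024-01-01T00:00:00"

changes = (
    spark.read.parquet(SOURCE)
    .where(F.col("modified_at") > F.lit(last_run).cast("timestamp"))
)
changes.write.mode("append").parquet(TARGET)
```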
…our datalake platform; Kubernetes for data services and task orchestration; Terraform for infrastructure; Streamlit for data applications; Airflow purely for job scheduling and tracking; CircleCI for continuous deployment; Parquet and Delta file formats on S3 for data lake storage; Spark for data processing; DBT for data modelling; SparkSQL for analytics. Why else you'll love it here: wondering …
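Tying a few of those pieces together, here is a minimal PySpark sketch: read raw Parquet from S3, land a modelled table in Delta format, and query it with SparkSQL. The bucket and table names are assumptions, and the session config assumes the delta-spark package is installed:

```python
from pyspark.sql import SparkSession

# Enable Delta Lake support on the session (requires delta-spark).
spark = (
    SparkSession.builder.appName("lake-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bucket and paths are illustrative.
raw = spark.read.parquet("s3://example-lake/raw/trades/")

# Land the modelled table as Delta so it supports ACID updates.
raw.write.format("delta").mode("overwrite").save("s3://example-lake/models/trades/")

# SparkSQL for analytics over the same table.
spark.read.format("delta").load("s3://example-lake/models/trades/") \
     .createOrReplaceTempView("trades")
spark.sql("SELECT symbol, avg(price) AS avg_price FROM trades GROUP BY symbol").show()
```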
Nursling, Southampton, Hampshire, England, United Kingdom Hybrid / WFH Options
Ordnance Survey
…Survey Testing Community, with common standards such as metrics and use of test tools. Here is a snapshot of the technologies that we use: Scala, Apache Spark, Databricks, Apache Parquet, YAML, Azure Cloud Platform, Azure DevOps (Test Plans, Backlogs, Pipelines), Git, GeoJSON. What we're looking for: highly skilled in creating, maintaining and peer-reviewing test automation code, preferably …
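For a flavour of the data-focused test automation this role describes, here is a small pytest sketch (in Python rather than the team's Scala) asserting basic quality properties on a Parquet output; the path and column names are invented:

```python
import pandas as pd
import pytest

# Hypothetical path; in practice this would point at a pipeline output.
DATASET = "output/features.parquet"

@pytest.fixture(scope="module")
def df() -> pd.DataFrame:
    return pd.read_parquet(DATASET)

def test_no_duplicate_keys(df):
    # Primary-key style check on an assumed `feature_id` column.
    assert not df["feature_id"].duplicated().any()

def test_geometry_present(df):
    # GeoJSON-derived rows should always carry a geometry payload.
    assert df["geometry"].notna().all()
```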
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
…role: large and complex datasets; Azure, Azure Databricks; Microsoft SQL Server; Lakehouse, Delta Lake; data warehousing; ETL; CDC; stream processing; database design; ML; Python/PySpark; Azure Blob Storage; Parquet; Azure Data Factory. Desirable: any exposure working in a software house, consultancy, retail or retail automotive sector would be beneficial but not essential.
…and RAG-based solutions. Proficiency in Python and modern AI/ML libraries (e.g. HuggingFace, LangChain, TensorFlow, PyTorch). Experience with data exchange and storage frameworks (e.g. APIs, SQL, NoSQL, Parquet). Track record of delivering technical solutions in Agile environments. Excellent communication skills and a collaborative mindset. Beneficial, but not essential: experience with containerisation (Docker); awareness of secure data handling …
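To ground the RAG reference, here is a toy Python sketch of the retrieval step only: cosine similarity over document vectors. A real system would obtain the vectors from an embedding model (e.g. via HuggingFace) rather than a random generator:

```python
import numpy as np

# Toy corpus and "embeddings"; random vectors stand in for model output.
documents = ["refund policy", "shipping times", "warranty terms"]
rng = np.random.default_rng(0)
doc_vectors = rng.normal(size=(len(documents), 8))

def retrieve(query_vector: np.ndarray, k: int = 2) -> list[str]:
    """Return the k documents whose vectors are most cosine-similar to the query."""
    norms = np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
    scores = doc_vectors @ query_vector / norms
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# The retrieved passages would then be stitched into the LLM prompt.
print(retrieve(rng.normal(size=8)))
```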
…with a keen understanding of LLM and RAG technologies. Strong development capabilities, particularly in Python. Experience with data exchange, processing, and storage frameworks (ETL, ESB, API, SQL, NoSQL, and Parquet). Comfort with Agile development methodologies. Excellent teamwork and communication skills, with a talent for translating technical concepts into actionable insights for non-specialists. Ability to influence company decision-making …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
All our office locations considered: Newbury & Liverpool (UK); Šibenik, Croatia. We're on the hunt for builders. No, we've not ventured into construction in our quest to conquer the world; rather, we want a designer and builder of systems …
About the role: Taktile is a high-growth, post-product-market-fit start-up on a fast trajectory to becoming the market leader in the field of automated decisioning. We are looking for a Full-stack Engineer to join the Decide …
Python & C# Developer - Front Office Risk (VP), Commodities Tech. Overview: Citigroup is one of the largest players in the financial industry, employing more than 200,000 people across the globe. What we do / the team: the Commodities trading business …
The engineering team at Chainalysis is inspired by solving the hardest technical challenges and creating products that build trust in cryptocurrencies. We're a global organization with teams in Denmark, the UK, Canada, and the USA who thrive on the challenging work we …