SFTP protocols. ETL/ELT Pipelines: Design and optimize data pipelines using Azure Data Factory and Databricks. Medallion Architecture: Implement Bronze, Silver, and Gold layers using formats like Delta, Parquet, and JSON for data transformation. Data Modeling: Develop and optimize data models using star schema and slowly changing dimensions for analytics and operations. Data Governance: Ensure robust data security … Azure Data Engineer. Technical Expertise: Proficiency with Azure Data Factory, Databricks and Azure Storage. Strong skills in SQL, Python, and data modeling techniques. Familiarity with data formats like Parquet and JSON. Experience with AI/ML model management on Azure Databricks. Education: Bachelor's degree in IT, Computer Science, or a related field. Microsoft Certified: Azure Data …
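The "star schema and slowly changing dimensions" requirement in this listing can be illustrated with a minimal Type 2 SCD upsert. This is a sketch only, using Python's built-in sqlite3 rather than the Databricks stack the listing names; the `dim_customer` table and its columns are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE dim_customer (
        customer_id TEXT, city TEXT,
        valid_from TEXT, valid_to TEXT, is_current INTEGER
    )""")

def scd2_upsert(con, customer_id, city, as_of):
    """Type 2 upsert: close the current row if the tracked
    attribute changed, then insert a new current row."""
    row = con.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row is not None and row[0] == city:
        return  # no change: keep the existing current row
    if row is not None:
        con.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (as_of, customer_id))
    con.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, as_of))

scd2_upsert(con, "C1", "London", "2024-01-01")
scd2_upsert(con, "C1", "Derby", "2024-06-01")  # change -> history row retained
```

In a Gold-layer dimension table this pattern preserves full attribute history, which is what distinguishes Type 2 from a simple overwrite (Type 1).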
London, England, United Kingdom Hybrid / WFH Options
LHV Bank
tools (QuickSight, Power BI, Tableau, Looker, etc.). Interest or experience in building internal data communities or enablement programs. Working with diverse data sources (APIs, CRMs, SFTP, databases) and formats (Parquet, JSON, XML, CSV). Exposure to machine learning models or AI agents. Why Join Us: Help shape the future of data in an organization that treats data as a product …
Derby, England, United Kingdom Hybrid / WFH Options
Cooper Parry
with Power BI, semantic modelling, and DAX. Strong SQL and data manipulation skills. Exposure to Python and PySpark is required. Experience working with open data formats like Delta Lake, Parquet, JSON, CSV. Familiarity with CI/CD pipelines, version control (e.g., Git), and deployment automation tools. Bonus points if you have: Exposure to MuleSoft or other API integration tools …
London, England, United Kingdom Hybrid / WFH Options
BCW
Java, R, DAX, M is an advantage. SQL and NoSQL Databases and Data Lakes: Familiarity with SQL and NoSQL databases such as MongoDB, and with file formats such as Parquet for data storage and retrieval. Data Preparation and Processing Techniques: Understanding of data preparation and processing techniques such as data cleaning, normalization, data imputation, feature engineering, and data transformation …
Herndon, Virginia, United States Hybrid / WFH Options
Maxar Technologies Holdings Inc
Minimum of 3 years' experience with Python. Demonstrated experience building and orchestrating automated, production-level data pipelines and solutions (ETL/ELT). Experience with file-based data storage, including Parquet or Iceberg. Experience with data catalogs (e.g. Hive, AWS Glue). General understanding of key AWS services (e.g. EC2, S3, EKS, IAM, Lambda). Experience building and/or …
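The "production-level data pipelines (ETL/ELT)" requirement above can be sketched as three composable steps. This is a language-level illustration only, not the listing's actual stack: it uses stdlib CSV and line-delimited JSON as a stand-in for Parquet, and all field names are invented:

```python
import csv
import io
import json

def extract(csv_text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: cast types and drop malformed records."""
    out = []
    for r in rows:
        try:
            out.append({"sensor": r["sensor"], "reading": float(r["reading"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad rows, not drop them
    return out

def load(rows):
    """Load: serialise to line-delimited JSON (stand-in for a Parquet write)."""
    return "\n".join(json.dumps(r) for r in rows)

raw = "sensor,reading\na,1.5\nb,oops\nc,2.0\n"
result = load(transform(extract(raw)))
```

Keeping each stage a pure function of its input is what makes pipelines like this easy to orchestrate and retry from tools such as Airflow or AWS Glue.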
Newbury, England, United Kingdom Hybrid / WFH Options
Intuita
All our office locations considered: Newbury & Liverpool (UK); Šibenik, Croatia (considered). We're on the hunt for builders. No, we've not ventured into construction in our quest to conquer the world, rather a designer and builder of systems …
Reading, England, United Kingdom Hybrid / WFH Options
Areti Group | B Corp™
Expert knowledge of the Microsoft Fabric Analytics Platform (Azure SQL, Synapse, Power BI). • Proficient in Python for data engineering tasks, including data ingestion from APIs, creation and management of Parquet files, and execution of ML models. • Strong SQL skills, enabling support for Data Analysts with efficient and performant queries. • Skilled in optimizing data ingestion and query performance for MSSQL …
London, England, United Kingdom Hybrid / WFH Options
Automata
SQL for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, and Glue to develop end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes of structured and unstructured data from diverse sources. Strong …
London, England, United Kingdom Hybrid / WFH Options
Our Future Health
line knowledge and Unix skills. Good understanding of cloud environments (ideally Azure), distributed computing and optimising workflows and pipelines. Understanding of common data transformation and storage formats, e.g. Apache Parquet, Delta tables. Understanding of containerisation (e.g. Docker) and deployment (e.g. Kubernetes). Working knowledge of Spark, Databricks, Data Lakes. Follow best practices like code review, clean code and unit …
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
Experience with alternative data technologies (e.g. DuckDB, Polars, Daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer: We want to give you a great work environment; contribute back to both your personal and professional development; and give you great benefits …
Annapolis Junction, Maryland, United States Hybrid / WFH Options
Halogen Engineering Group, Inc
software, libraries, and packages involving stream/batch data processing and analytic frameworks. Experience with data parsing/transformation technologies and file formats including JSON, XML, CSV, TCLD, and Parquet. General cloud and HPC knowledge regarding compute, networking, memory, and storage components. Experience with Linux administration including software integration, configuration management and routine O&M operations related to provisioning …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Identify Solutions
Cloud and big data technologies (e.g. Spark/Databricks/Delta Lake/BigQuery). Familiarity with eventing technologies (e.g. Event Hubs/Kafka) and file formats such as Parquet/Delta/Iceberg. Want to learn more? Get in touch for an informal chat.
Washington, Washington DC, United States Hybrid / WFH Options
Initiate Government Solutions
onsite client meetings as requested. Responsibilities and Duties (included but not limited to): ETL (Extract, Transform, and Load) to put data into a variety of target formats (text, SQL, Parquet, CSV, MDF, IRIS). Model data tables and make them practical and usable within the evolving data syndication database architecture. Design the logical and physical schemas needed to support an …
London, England, United Kingdom Hybrid / WFH Options
Vortexa
AWS, K8s, and Airflow. Fluent in both Java and Python (with Rust being a plus). Knowledgeable about data lake systems like Athena, and big data storage formats like Parquet, HDF5, and ORC, with a focus on data ingestion. Driven by working in an intellectually engaging environment with top minds in the industry, where constructive and friendly challenges are …
Bristol, England, United Kingdom Hybrid / WFH Options
MBDA Missile Systems
various exchange and processing techniques (ETL, ESB, API). Lead the way in delivering Agile methodologies for successful and timely project delivery. Leverage strong database skills (SQL, NoSQL, and Parquet) for efficient data storage and management. What we're looking for from you: Proficiency in Data Science techniques, including statistical models and ML algorithms. Expertise in NLP, with a … keen understanding of LLM and RAG technologies. Strong development capabilities, particularly in Python. Experience with data exchange, processing, and storage frameworks (ETL, ESB, API, SQL, NoSQL, and Parquet). Comfort with Agile development methodologies. Excellent teamwork and communication skills, with a talent for translating technical concepts into actionable insights for non-specialists. Ability to influence company decision-makers and …
Leatherhead, England, United Kingdom Hybrid / WFH Options
JCW
with Azure Integration Services (e.g., Logic Apps, ADF, Service Bus, Functions). Comfortable working with Git, Azure DevOps, and unit testing practices. Knowledge of common data formats: CSV, JSON, XML, Parquet. Ability to lead integration designs with minimal rework required. 🧾 Preferred Qualifications: 🎓 Certification in SSIS or relevant Microsoft technologies. 💡 Proven track record of delivering robust integration solutions. 🧠 Key Skills & Traits …
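The "common data formats" skill this listing names can be sketched as a single normalising reader for the stdlib-supported formats (CSV, JSON, XML; Parquet needs a third-party library, so it is omitted here). The record layout and function name are invented for illustration:

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def read_records(payload, fmt):
    """Normalise a CSV, JSON or XML payload into a list of dicts."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        return json.loads(payload)
    if fmt == "xml":
        # each child of the root is one record; each grandchild one field
        root = ET.fromstring(payload)
        return [{field.tag: field.text for field in rec} for rec in root]
    raise ValueError(f"unsupported format: {fmt}")

# the same logical record in three wire formats
as_csv = "id,name\n1,widget\n"
as_json = '[{"id": "1", "name": "widget"}]'
as_xml = "<rows><row><id>1</id><name>widget</name></row></rows>"
```

Funnelling every inbound format into one canonical shape early is a common integration-layer pattern, since downstream mapping logic then only has to handle one representation.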
data. Proficiency with Linux development, Git, containers, and CI/CD workflows. Familiarity with SQL and at least one columnar or time-series data store (e.g., kdb+, ClickHouse, InfluxDB, Parquet/Arrow). Excellent problem-solving abilities, attention to detail, and clear communication skills. Nice to have: Prior exposure to execution algos, TCA, order-routing, or market-impact modelling. Knowledge …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Modeler/Architect. London Market Insurance industry experience. Proficiency with data modelling tools such as Erwin. Knowledge of modelling methodologies including Kimball. Experience with Data Lake formats such as Parquet and Delta Lake. Contract Duration: 6 months. Location: London/WFH hybrid. If this position interests you, please don’t hesitate to apply. William will contact you in due …