Excellent stakeholder engagement and communication skills — confident working cross-functionally and enabling others to use and trust the data
Desirable:
Experience with Lakehouse patterns; setting standards, quality checks etc.
Parquet partitioning and clustering to optimise for query speed and cost
GitHub Actions
REST APIs
ML Ops tooling
What you can expect
25 days holiday excluding public holidays increasing after
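As an illustration of the partitioning technique this listing asks for, here is a minimal PySpark sketch; the bucket paths and the event_date column are hypothetical, and the same idea applies to any low-cardinality partition key:

```python
# Hypothetical example: write Parquet partitioned by a low-cardinality date column
# so query engines can prune whole directories, cutting scan time and cost.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parquet-partitioning-sketch").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")  # hypothetical source

(events
    .repartition("event_date")          # cluster rows so each partition gets fewer, larger files
    .write
    .partitionBy("event_date")          # one directory per event_date value
    .mode("overwrite")
    .parquet("s3://example-bucket/curated/events/"))

# A reader filtering on the partition column only scans the matching directories:
(spark.read.parquet("s3://example-bucket/curated/events/")
      .where("event_date = '2024-01-01'")
      .count())
```

Clustering within files, for example sorting by a frequently filtered column before writing, further helps readers skip row groups via Parquet min/max statistics.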
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Amtis Professional Ltd
continuously improve data infrastructure
Explore AI-driven enhancements to boost data accuracy and productivity
Requirements:
Strong experience with: Azure Databricks, Data Factory, Blob Storage; Python/PySpark; SQL Server, Parquet, Delta Lake
Deep understanding of: ETL/ELT, CDC, stream processing; Lakehouse architecture and data warehousing; scalable pipeline design and database optimisation
A proactive mindset, strong problem-solving skills
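To make the CDC requirement concrete, here is a minimal sketch of a CDC-style upsert into a Delta Lake table with PySpark; the storage paths, key column, and the op flag on the change feed are hypothetical:

```python
# Hypothetical CDC upsert: apply a batch of captured changes to a Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("cdc-upsert-sketch")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

# Change feed landed by an upstream tool (e.g. Data Factory); 'op' marks insert/update/delete rows.
changes = spark.read.parquet("abfss://landing@exampleaccount.dfs.core.windows.net/customers_cdc/")

target = DeltaTable.forPath(
    spark, "abfss://lake@exampleaccount.dfs.core.windows.net/silver/customers/")

(target.alias("t")
       .merge(changes.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedDelete(condition="s.op = 'D'")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```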
for continuous deployment
AWS cloud infrastructure
Kubernetes for data services and task orchestration
Google Analytics, Amplitude and Firebase for client application event processing
Airflow for job scheduling and tracking
Parquet and Delta file formats on S3 for data lake storage
Streamlit for data applications
Why else you'll love it here
Wondering what the salary for this role is
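For the Airflow piece of that stack, a minimal sketch of a scheduled job; the DAG id, task name, and S3 path are hypothetical, and the real task body would do the actual Parquet/Delta export:

```python
# Hypothetical daily DAG that would export a day's events to the data lake.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def export_events(ds, **_):
    # Placeholder for the real job: read events for run date `ds` and
    # write them as Parquet/Delta to S3 (e.g. with Spark or pandas).
    print(f"Exporting events for {ds} to s3://example-datalake/events/dt={ds}/")


with DAG(
    dag_id="daily_events_export",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="export_events", python_callable=export_events)
```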
market data feeds (Bloomberg, Refinitiv, etc). Familiarity with containerisation (Docker) and cloud platforms (AWS/GCP). Database experience (SQL/NoSQL) and use of modern data formats (Parquet, HDF5).
Contract: Initial term with possible extension
Day Rate: Competitive, depending on experience
McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
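For the data formats named above, a quick pandas sketch of writing the same frame to both; the file names and columns are hypothetical (Parquet output needs pyarrow or fastparquet, HDF5 needs PyTables):

```python
# Hypothetical tick data written to both columnar (Parquet) and HDF5 storage.
import pandas as pd

ticks = pd.DataFrame({
    "symbol": ["VOD.L", "BARC.L"],
    "price": [72.5, 231.1],
    "ts": pd.to_datetime(["2024-01-02 08:00", "2024-01-02 08:00"]),
})

ticks.to_parquet("ticks.parquet", index=False)                   # compressed, columnar, analytics-friendly
ticks.to_hdf("ticks.h5", key="ticks", mode="w", format="table")  # fast local numeric store, appendable

print(pd.read_parquet("ticks.parquet").head())
```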
such as GitHub Actions or Jenkins. Solid grounding in modern engineering principles and full-stack development.
Bonus Skills: Airflow, Kafka/Kafka Connect, Delta Lake, JSON/XML/Parquet/YAML, cloud-based data services.
Why Apply?
Work for a global payments innovator shaping the future of commerce. Join a highly skilled, collaborative, and forward-thinking data team.
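One way the bonus skills above combine in practice, sketched with PySpark Structured Streaming; the broker address, topic, event schema, and lake paths are all hypothetical, and the job assumes the Kafka and Delta connectors are available on the Spark classpath:

```python
# Hypothetical stream: JSON payment events from Kafka landed as a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-to-delta-sketch").getOrCreate()

schema = StructType([
    StructField("payment_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("created_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "payments")
       .load())

payments = (raw.select(from_json(col("value").cast("string"), schema).alias("p"))
               .select("p.*"))

(payments.writeStream
         .format("delta")
         .option("checkpointLocation", "s3://example-lake/_checkpoints/payments/")
         .start("s3://example-lake/bronze/payments/"))
```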
our datalake platform
Kubernetes for data services and task orchestration
Terraform for infrastructure
Streamlit for data applications
Airflow purely for job scheduling and tracking
Circle CI for continuous deployment
Parquet and Delta file formats on S3 for data lake storage
Spark for data processing
DBT for data modelling
SparkSQL for analytics
Why else you'll love it here
Wondering
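As a small illustration of the SparkSQL-for-analytics layer in that stack, assuming a hypothetical Delta table of orders on S3:

```python
# Hypothetical analytics query over a Delta table registered as a temp view.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sparksql-analytics-sketch").getOrCreate()

(spark.read.format("delta")
      .load("s3://example-lake/silver/orders/")
      .createOrReplaceTempView("orders"))

daily_revenue = spark.sql("""
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")

daily_revenue.show()
```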
an expert
BS degree in Computer Science or meaningful relevant work experience
Preferred Qualifications
Experience with large scale data platform infrastructure such as Spark, Flink, HDFS, AWS/S3, Parquet, Kubernetes is a plus
in peer reviews. Collaborate with stakeholders and researchers to support analytics and product development. Integrate data from APIs, S3 buckets, and structured/unstructured sources (JSON, CSV, Excel, PDF, Parquet). Join geospatial datasets with external data sources and apply complex transformations. Define validated data schemas and create clear documentation for partners and teams. Explore and evaluate new data
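One possible reading of the "validated data schemas" responsibility, sketched with pandas plus the pandera library; the column names, checks, and S3 path are hypothetical:

```python
# Hypothetical schema for an ingested geospatial sites file; validation fails loudly
# if a partner delivery breaks the contract.
import pandas as pd
import pandera as pa

sites_schema = pa.DataFrameSchema({
    "site_id": pa.Column(str, nullable=False),
    "latitude": pa.Column(float, pa.Check.in_range(-90, 90)),
    "longitude": pa.Column(float, pa.Check.in_range(-180, 180)),
    "population": pa.Column(int, pa.Check.ge(0)),
})

raw = pd.read_parquet("s3://example-bucket/ingest/sites.parquet")  # reading from S3 needs s3fs
validated = sites_schema.validate(raw)  # raises SchemaError listing the offending rows/columns
```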
time for your personal development
What you'll be working with:
• Backend: Distributed, event-driven core Java (90% of the code-base), MySQL, Kafka
• Data analytics: Python & Jupyter notebooks, Parquet, Docker
• Testing: JUnit, JMH, JCStress, Jenkins, Selenium, many in-house tools
• OS: Linux (Fedora for development, Rocky in production)
The LMAX way is to use the right tool for
Location: London, United Kingdom
Employment Type: Full time
Location Type: Remote
Department: R&D Investigations
The engineering team at Chainalysis is inspired by solving the hardest technical challenges and creating products that build trust in cryptocurrencies. We're a global