across SQL Server, PostgreSQL, and cloud databases Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data) Proficiency with Parquet/Delta Lake or other modern data storage formats Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing Knowledge of data architectures supporting AI
facing consulting environment, with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 5+ years' experience working with Databricks, including Spark and Delta Lake Strong skills in Python and/or Scala for data engineering tasks Comfortable working with cloud platforms like Azure, AWS, and/or Google Cloud A problem
with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 7 years' experience working with Databricks. Good hands-on experience with Spark, Delta Lake, and Unity Catalog Strong understanding of cloud platforms like Azure, AWS, and/or Google Cloud Experience designing data lakes, lakehouses, and modern data platforms Proven experience
Spark, Kafka, and AWS Glue/EMR. Architect storage and processing layers using Parquet and Iceberg for schema evolution, partitioning, and performance optimization. Integrate AWS data services (S3, Redshift, Lake Formation, Kinesis, Lambda, DynamoDB) into enterprise solutions. Ensure data governance, lineage, cataloging, and security compliance in line with financial regulations (Basel III, MiFID II, Dodd-Frank). Partner with … technical architecture. Provide technical leadership and guidance to engineering teams. Required Skills & Experience: Core Technical Expertise Strong hands-on skills in AWS Data Services (S3, Redshift, Glue, EMR, Kinesis, Lake Formation, DynamoDB). Expertise in Apache Kafka (event streaming) and Apache Spark (batch and streaming). Proficiency in Python for data engineering and automation. Strong knowledge of Parquet, Iceberg … Knowledge Experience with trading systems, market data feeds, risk analytics, and regulatory reporting. Familiarity with time-series data, reference/master data, and real-time analytics. Preferred Exposure to Delta Lake, DBT, Databricks, or Snowflake. AWS Certifications (Solutions Architect - Professional, Data Analytics Specialty). Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field.
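The "partitioning" requirement in the role above refers to hive-style partition layouts, which Parquet and Iceberg tables use to prune data at query time. A minimal, self-contained sketch of that layout (the column names, records, and bucket path are hypothetical; in practice Spark or an Iceberg writer produces these directories):

```python
# Conceptual sketch of hive-style partition layout, as used by Parquet/Iceberg
# tables. Illustrative only: real writers (Spark, Iceberg) manage files, schema
# evolution, and metadata; this just shows how partition columns map to paths.
from collections import defaultdict

def partition_paths(records, partition_cols, root="s3://bucket/trades"):
    """Group records into hive-style partition directories, e.g.
    s3://bucket/trades/trade_date=2024-01-02/desk=FX"""
    buckets = defaultdict(list)
    for rec in records:
        key = "/".join(f"{col}={rec[col]}" for col in partition_cols)
        buckets[f"{root}/{key}"].append(rec)
    return dict(buckets)

# Hypothetical trade records; partitioning by date and desk means a query
# filtered on trade_date only has to read the matching directories.
rows = [
    {"trade_date": "2024-01-02", "desk": "FX", "notional": 1_000_000},
    {"trade_date": "2024-01-02", "desk": "Rates", "notional": 250_000},
    {"trade_date": "2024-01-03", "desk": "FX", "notional": 500_000},
]
layout = partition_paths(rows, ["trade_date", "desk"])
```

Choosing partition columns is the performance-optimization decision the ad alludes to: low-cardinality columns that appear in most filters (trade date, desk) partition well, while high-cardinality ones fragment the table into many tiny files.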
ML pipelines and infrastructure Expertise in MLOps practices, including model lifecycle management, versioning, monitoring, and CI/CD for ML Experience with big data ecosystems (e.g., Spark, Hive, Databricks, Delta Lake) and streaming technologies Proficient in working with ML frameworks like TensorFlow, PyTorch, XGBoost, or similar Experience working in cloud-based environments (AWS, GCP, or Azure) and with
models and reports. Experience required: Strong background in data engineering, warehousing, and data quality. Proficiency in Microsoft 365, Power BI, and other BI tools. Familiarity with Azure Databricks and Delta Lake is desirable. Ability to work autonomously in a dynamic environment and contribute to team performance. Strong communication, influencing skills, and a positive, can-do attitude. Knowledge of
the technical lead and design authority Ability to partner with and influence senior client stakeholders to drive the programme to the required outcomes Hands-on experience of Databricks including Delta Lake and Unity Catalog Experience of cloud architectures. We favour Azure and AWS. You have guided data engineers and analysts through optimising their workloads and take FinOps at
Kubernetes stack with secure-by-design tools Update security, software, dependencies and libraries Set up and migrate Spark clusters across platforms Manage user accounts and IdP permissions Maintain the Delta Lake Ensure secure-by-design assurance throughout the platform Experience Required: Strong Linux engineering background Expertise in Kubernetes and Docker Proficient in scripting (Python, Bash) Experience with air-gapped environments
engineering experience Strong Kubernetes and Docker knowledge Confident scripting in Python and Bash Experience with secure or air-gapped environments Familiarity with HPC or distributed data systems (e.g. Spark, Delta Lake) Knowledge of security, encryption, and compliance standards TO BE CONSIDERED: Please either apply through this advert or email me directly at . For further information, call me
multiple time zones. Ability to foster a collaborative learning environment that encourages continuous improvement and knowledge sharing. Nice to Have: Familiarity with technologies such as Elasticsearch, Solr, PostgreSQL, Databricks, Delta Share, and Delta Lake. Experience working with complex patent and litigation data models. Exposure to external data sources such as DocDB, Espacenet, and USPTO. Proficiency with Pandas and