years as a Data Engineer. Hands-on with Databricks, Spark, Python, SQL. Cloud experience (Azure, AWS, or GCP). Strong understanding of data quality, governance, and security. Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure. You'll: build and optimise ETL pipelines; enable analytics and reporting teams; drive automation and best practices. Why …
London (City of London), South East England, United Kingdom
Miryco Consultants Ltd
machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions. Key Skills & Experience Essential: Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake). Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik). Strong SQL and data modelling skills. Experience working with large, complex financial …
Warrington, Cheshire, England, United Kingdom Hybrid / WFH Options
Brookson
Science, Mathematics, Engineering or other STEM. A strong team player with empathy, humility and dedication to joint success and shared development. Desirable Experience and Qualifications: Experience with Databricks or Delta Lake architecture. Experience building architecture and data warehousing within the Microsoft stack. Experience in development source control (e.g. Bitbucket, GitHub). Experience in low-code analytical tools (e.g. …
Oak Brook, Illinois, United States Hybrid / WFH Options
Ace Hardware Corporation
performance and cost across the Databricks platform. Collaborate with teams to support data pipelines, ML workflows, and analytics use cases. Integrate Databricks with cloud storage solutions (S3, ADLS) and Delta Lake. Troubleshoot issues related to jobs, performance, or user access. Maintain platform documentation and conduct user training. Stay current with Databricks updates, new features, and best practices. Support hybrid … administering Cloudera Hadoop clusters. 2-3+ years of Databricks experience in production environments. 2+ years of Databricks administration experience on Azure (preferred). Strong knowledge of Spark and Delta Lake architecture. Experience with IAM, Active Directory, and SSO integration. Familiarity with DevOps and CI/CD for data platforms. Deep understanding of the Hadoop ecosystem: Hive, Impala, Spark … Python for automation and system administration. Solid foundation in Linux/Unix system administration. Preferred Qualifications: Experience with cloud technologies (Azure or GCP preferred). Exposure to modern data lake and hybrid cloud architectures. Cloudera and/or Databricks certifications. Familiarity with infrastructure-as-code and automation tools (e.g., Terraform). Experience supporting Oracle database administration (e.g., backups, user …
of Databricks jobs, notebooks and configurations across environments. Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices. Technical Skills: Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.). Proficiency in Azure cloud services. Solid understanding of Spark and PySpark for big data …
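Several of the listings above cite Delta Lake experience, whose core operation is the MERGE (upsert) that reconciles an incoming batch with an existing table. As a purely conceptual sketch, assuming only the Python standard library (no Spark or Delta Lake involved, and with hypothetical record and function names), the matched/not-matched semantics look like this:

```python
def merge_upsert(target, updates, key="id"):
    """Toy model of Delta Lake MERGE INTO semantics:
    update rows whose key matches, insert rows whose key is new.

    target  -- list of dict rows (the existing "table")
    updates -- list of dict rows (the incoming batch)
    key     -- column used as the merge condition
    """
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        if row[key] in merged:
            merged[row[key]].update(row)   # WHEN MATCHED THEN UPDATE
        else:
            merged[row[key]] = dict(row)   # WHEN NOT MATCHED THEN INSERT
    return sorted(merged.values(), key=lambda r: r[key])


table = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
batch = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
print(merge_upsert(table, batch))
# → [{'id': 1, 'qty': 5}, {'id': 2, 'qty': 9}, {'id': 3, 'qty': 1}]
```

In real Delta Lake the same outcome is expressed declaratively (`MERGE INTO target USING updates ON ...`) and executed transactionally by Spark; the sketch only illustrates the row-level semantics.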
across SQL Server, PostgreSQL, and cloud databases. Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data). Proficiency with Parquet/Delta Lake or other modern data storage formats. Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing. Knowledge of data architectures supporting AI …
with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 7 years' experience working with Databricks. Good hands-on experience with Spark, Delta Lake, and Unity Catalog. Strong understanding of cloud platforms such as Azure, AWS and/or Google Cloud. Experience designing data lakes, lakehouses, and modern data platforms. Proven experience …
Spark, Kafka, and AWS Glue/EMR. Architect storage and processing layers using Parquet and Iceberg for schema evolution, partitioning, and performance optimization. Integrate AWS data services (S3, Redshift, Lake Formation, Kinesis, Lambda, DynamoDB) into enterprise solutions. Ensure data governance, lineage, cataloging, and security compliance in line with financial regulations (Basel III, MiFID II, Dodd-Frank). Partner with … technical architecture. Provide technical leadership and guidance to engineering teams. Required Skills & Experience: Core Technical Expertise: Strong hands-on skills in AWS data services (S3, Redshift, Glue, EMR, Kinesis, Lake Formation, DynamoDB). Expertise in Apache Kafka (event streaming) and Apache Spark (batch and streaming). Proficiency in Python for data engineering and automation. Strong knowledge of Parquet, Iceberg … Knowledge: Experience with trading systems, market data feeds, risk analytics, and regulatory reporting. Familiarity with time-series data, reference/master data, and real-time analytics. Preferred: Exposure to Delta Lake, DBT, Databricks, or Snowflake. AWS Certifications (Solutions Architect - Professional, Data Analytics Specialty). Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. …
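The Parquet/Iceberg partitioning mentioned above usually means Hive-style directory layouts, where partition column values are encoded into the storage path so that queries can prune whole directories. As an illustrative stdlib-only sketch (the bucket, column names, and helper functions are hypothetical, not part of any real API):

```python
from datetime import date

def partition_path(base, trade_date, desk):
    """Build a Hive-style partition path, e.g.
    base/year=YYYY/month=MM/desk=DESK, as Spark's partitionBy would."""
    return f"{base}/year={trade_date.year}/month={trade_date.month:02d}/desk={desk}"

def prune(paths, year):
    """Partition pruning: keep only paths whose year= segment matches,
    skipping all other partitions without reading their data."""
    return [p for p in paths if f"/year={year}/" in p]

p = partition_path("s3://bucket/trades", date(2024, 3, 7), "fx")
print(p)  # → s3://bucket/trades/year=2024/month=03/desk=fx
```

Iceberg hides this layout behind table metadata (hidden partitioning), but the pruning idea — match a filter against partition values before touching files — is the same.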
London (City of London), South East England, United Kingdom
Careerwise
and regulatory compliance. Provide technical leadership to engineering teams and contribute to architecture standards. Skills & Experience: Proven experience as a Data/Solution Architect with strong expertise in Databricks (Delta Lake, MLflow, Spark). Deep understanding of commercial insurance processes (underwriting, claims, personal/specialty lines). Strong background in Azure (Synapse, Data Factory, Fabric) or similar cloud …
SR2 | Socially Responsible Recruitment | Certified B Corporation™
maintain real-time streaming data systems (Kafka, Kinesis, or Flink). Build robust feature pipelines using Airflow, Prefect, or Dagster. Manage and optimise data storage solutions (Snowflake, BigQuery, Redshift, or Delta Lake). Automate and scale model training pipelines in close partnership with ML engineers. Deploy, observe, and improve pipelines using Docker, Kubernetes, Terraform, or dbt. Champion data reliability, scalability …
Linux including basic commands and shell scripting. 2+ years of experience with Agile engineering practices. 1+ years of experience with Databricks. 1+ years of experience with Lakehouse, Iceberg or Delta Lake. At this time, Capital One will not sponsor a new applicant for employment authorization, or offer any immigration-related support for this position (i.e. H-1B, F-1 OPT …
this position is permitted to work at a Southfield, Michigan office location if requested by the team member. Design and implement core components of the data platform (e.g., data lake, streaming infrastructure, DaaS, catalog), emphasizing scalability, reliability, and observability. Balance hands-on delivery with architectural foresight, contributing to cross-functional initiatives that strengthen the platform. Partner with data and … solutions. Experience building and operating applications on cloud platforms (e.g., AWS, Azure, or GCP), including deploying and supporting containerized services (Docker, Kubernetes, ECS/EKS). Familiarity with lakehouse principles (Delta Lake, Iceberg, or Hudi) and best practices for schema evolution, versioning, and performance optimization. Experience with observability practices (metrics, logs, tracing, alerting) and tools (e.g., Dynatrace, Splunk, CloudWatch … coach less experienced engineers, contributing to team growth and technical maturity. Familiarity with Agile delivery practices and other software development lifecycle methodologies. Preferred: Hands-on experience with lakehouse technologies (Delta, Iceberg, Hudi), beyond conceptual familiarity. Exposure to workflow orchestration frameworks (Airflow, Dagster, Prefect, Databricks Workflows). Experience with CI/CD pipelines for automated testing and deployment. Exposure to observability …