District of Columbia, United States Hybrid / WFH Options
Govcio LLC
is located in the Washington, DC area and will be a hybrid remote position. Responsibilities: Demonstrate hands-on experience with Databricks. Work in big data frameworks such as Apache Spark, Delta Lake, and Structured Streaming. Develop ETL/ELT pipelines, data lakehouse architecture, and data models. Program using Python and SQL, and possibly Scala and Java. Integrate with external … reports from OBIEE/Tableau into Databricks. Knowledge of cost optimization and governance strategies in cloud environments. Strong experience in Databricks (including both SQL and PySpark notebooks, Jobs, Workflows, Delta Lake, MLflow). Proficiency in big data frameworks like Apache Spark, Delta Lake, and Structured Streaming. Hands-on experience with cloud platforms – preferably Azure Databricks, but …
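The ETL/ELT responsibility described above follows a standard pattern: land raw data first, then transform it with SQL. The sketch below is a hedged, miniature illustration using Python's stdlib sqlite3 in place of a real lakehouse — the table names, columns, and sample rows are all invented for the example; in Databricks the same flow would use Delta tables and Spark SQL.

```python
import sqlite3

# Illustrative ELT flow: load raw data untyped first, then transform with SQL.
# Schema and data are invented for the example; in Databricks the same
# pattern would target Delta tables instead of sqlite3.

raw_rows = [
    ("2024-01-01", "widget", "19.99"),
    ("2024-01-01", "gadget", "5.50"),
    ("2024-01-02", "widget", "19.99"),
]

conn = sqlite3.connect(":memory:")

# Load: land the rows as-is in a raw staging table.
conn.execute("CREATE TABLE staging_sales (sale_date TEXT, product TEXT, price TEXT)")
conn.executemany("INSERT INTO staging_sales VALUES (?, ?, ?)", raw_rows)

# Transform: cast types and aggregate into a curated table.
conn.execute("""
    CREATE TABLE daily_revenue AS
    SELECT sale_date, ROUND(SUM(CAST(price AS REAL)), 2) AS revenue
    FROM staging_sales
    GROUP BY sale_date
""")

for row in conn.execute("SELECT * FROM daily_revenue ORDER BY sale_date"):
    print(row)  # → ('2024-01-01', 25.49) then ('2024-01-02', 19.99)
```

The load-then-transform ordering is what distinguishes ELT from classic ETL: the raw layer stays queryable, and transformations can be re-run against it.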
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments. Expert knowledge of data technologies, data transformation tools, and data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks or similar. Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. To be considered for this role you MUST have in-depth experience … role. KEYWORDS Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Factory, ADF, Hadoop, HDFS, Azure Data Lake, Delta Lake, Data Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
on experience of HDFS/Hadoop and on-prem (coding, configuration, automation, delivery, monitoring, security). Extensive Big Data hands-on experience - Cloudera or similar. Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Azure - hands-on experience including coding, configuration, automation and delivery. Advanced database and SQL skills. Certifications: Cloudera, Azure or FME certifications … Company Sponsorship. KEYWORDS Senior Data Engineer, Coding Skills, Spark, Java, Python, PySpark, Scala, ETL Tools, Azure Data Factory (ADF), Databricks, HDFS, Hadoop, Big Data, Cloudera, Data Lakes, Azure Data Lake, Delta Lake, Databricks Lakehouse, Data Analytics, SQL, Geospatial Data, FME, QGIS, PostGIS. Please note that due to a high level of applications, we can only respond …
cross-functionally with engineering, analytics, and infrastructure teams to transform raw data into valuable enterprise assets. Designing and implementing cloud-native data architectures using Databricks and technologies such as Delta Lake, Spark, and MLflow. Developing and maintaining robust data pipelines, including batch and streaming workloads, to support data ingestion, processing, and consumption. Collaborating with business stakeholders and analytics … is a strong advantage. Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Deep expertise in Databricks, including Spark (PySpark/Scala), Delta Lake, and orchestration within Databricks workflows. Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also …
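The batch ingestion-and-maintenance work described above usually centres on an upsert step: merge each new batch into the target table, updating changed rows and inserting new ones. In Delta Lake this is expressed as MERGE INTO; as a hedged stand-in, the sketch below reproduces the same update-or-insert semantics with stdlib sqlite3's ON CONFLICT clause. The customers table and its columns are invented for illustration.

```python
import sqlite3

# Illustrative upsert ("merge") step of an incremental batch pipeline.
# Delta Lake would express this as MERGE INTO; sqlite3's upsert clause
# gives equivalent update-or-insert semantics. Schema is invented.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT, updated TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "a@example.com", "2024-01-01"), (2, "b@example.com", "2024-01-01")],
)

# New batch: one changed row (id=2) and one brand-new row (id=3).
batch = [(2, "b.new@example.com", "2024-02-01"), (3, "c@example.com", "2024-02-01")]
conn.executemany(
    """INSERT INTO customers (id, email, updated) VALUES (?, ?, ?)
       ON CONFLICT(id) DO UPDATE SET email = excluded.email, updated = excluded.updated""",
    batch,
)

print(list(conn.execute("SELECT id, email FROM customers ORDER BY id")))
# → [(1, 'a@example.com'), (2, 'b.new@example.com'), (3, 'c@example.com')]
```

The same three-way outcome (unchanged, updated, inserted) is what a Delta Lake MERGE produces, with the added benefit of transactional guarantees at data-lake scale.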
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/ML/Machine Learning/AI/Artificial Intelligence/Based in the West Midlands/Solihull/Birmingham area, Permanent role, £50,000-£70,000 + car/allowance + bonus. One … Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks ML/Machine Learning/AI/Artificial Intelligence Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL Database Design Python/PySpark Azure Blob Storage Azure Data Factory …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse Strong command of SQL Excellent communication and collaboration skills What's in It for You: Up to £60,000 salary depending on …
and managing data and AI solutions - Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL - Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows - Experience collaborating with data scientists - Familiarity with Advana - Strong Python programming skills - Solid SQL knowledge for querying and data manipulation …
and managing data and AI solutions. Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL. Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows. Clearance Required: Active TS/SCI with CI polygraph. AA/Disability/Veteran. US Salary Range …
and managing data and AI solutions Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows Some of your day-to-day activities include, but are not limited to: Writing and maintaining code using an Extract-Transform …
Spark • Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles, ETL/ELT processes, data modeling techniques, and database systems • Proven experience with at least one major cloud platform (Azure … and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses • Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and performant data workflows • Integrating with cloud services such as Azure, AWS, or GCP to enable secure, cost-effective data solutions …
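A representative "transform" step from the pipeline work described above is deduplication: keeping only the latest record per key. In PySpark this is typically `Window.partitionBy(...).orderBy(...)` with `row_number()`; the hedged sketch below shows the identical logic with SQL window functions via stdlib sqlite3 (which supports ROW_NUMBER in SQLite 3.25+). The events table and its rows are invented for illustration.

```python
import sqlite3

# A common ETL cleansing transform: keep only the latest record per key,
# using ROW_NUMBER() over a partition ordered by timestamp descending.
# Table, columns, and data are invented for the example.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, status TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        (1, "signed_up", "2024-01-01"),
        (1, "upgraded", "2024-03-01"),   # later row for user 1 wins
        (2, "signed_up", "2024-02-01"),
    ],
)

latest = conn.execute(
    """SELECT user_id, status FROM (
           SELECT user_id, status,
                  ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY ts DESC) AS rn
           FROM events
       ) WHERE rn = 1
       ORDER BY user_id"""
).fetchall()
print(latest)  # → [(1, 'upgraded'), (2, 'signed_up')]
```

The same partition-and-rank idiom runs unchanged as Spark SQL against a Delta table, which is why window functions feature so heavily in these role requirements.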
market and products (we can teach that), but a passion for the transition to net zero is an excellent start. What We Work With: dbt for data modelling; Databricks Delta Lake for data lake and warehouse storage and querying; Python as our main programming language; Jupyter and JupyterHub for notebook analytics and collaboration; CircleCI for … deployment; AWS cloud infrastructure; Kubernetes for data services and task orchestration; Google Analytics, Amplitude and Firebase for client application event processing; Airflow for job scheduling and tracking; Parquet and Delta file formats on S3 for data lake storage; Streamlit for data applications. Why else you'll love it here: Wondering what the salary for this role is? Just …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Experience: 5 years of experience in a data engineering or similar technical role. Hands-on experience with key Microsoft Azure services: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, Azure SQL Database. Solid understanding of data modeling, ETL/ELT, and warehousing concepts. Proficiency in SQL and one or more programming languages (e.g., Python, Scala). Exposure to … Familiarity with software testing methodologies and development team collaboration. Experience working with Power BI and DAX. Strong documentation, communication, and stakeholder engagement skills. Preferred Qualifications: Experience with Lakehouse architecture, Delta Lake, or Databricks. Exposure to Agile/Scrum working practices. Microsoft certifications (e.g., Azure Data Engineer Associate). Background in consulting or professional services. Understanding of data governance and …
Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
ETL/ELT processes, and pipeline orchestration Familiarity with CI/CD and DevOps practices in a data engineering context Excellent communication and stakeholder engagement skills Desirable: Experience with Delta Lake, Power BI, and Azure DevOps Knowledge of data governance, security, and compliance frameworks Exposure to machine learning workflows and MLOps Relevant certifications (e.g. Microsoft Certified: Azure Data …
Sterling, Virginia, United States Hybrid / WFH Options
Progression Inc
data modeling and ETL best practices across projects. Troubleshoot and resolve issues in high-volume data processing environments. Experience with real-time data processing and streaming architectures. Knowledge of Delta Lake or similar data lakehouse technologies. Experience with CI/CD pipelines for data deployments. Cloud platform expertise (AWS, Azure, or GCP). Contributions to open-source projects. Progression …
and minimize complexity. Exceptional interpersonal skills - you communicate clearly with stakeholders as well as other engineers, fostering a collaborative, supportive working environment. Experience in the financial markets, especially in delta one, store of value, and/or FICC options trading. Experience with Linux-based, concurrent, high-throughput, low-latency software systems. Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster) … Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases. Have a Bachelor's or advanced degree in Computer Science, Mathematics, Statistics, Physics, Engineering, or equivalent work experience. For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at .
modern data platform, mentor talented engineers, and make a real impact on strategic decision-making. A successful Lead Azure Data Engineer should have: Strong experience with Databricks, Azure Synapse, Delta Lake, and pipeline orchestration. Hands-on with ETL/ELT processes, data modeling (star schema), SQL, Python, and data governance tools (e.g., Purview, Unity Catalog). Experience implementing …
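The star-schema data modeling asked for above can be sketched minimally: one fact table holding measures and foreign keys, joined to dimension tables holding descriptive attributes. The hedged example below uses stdlib sqlite3; every table, column, and row is invented for illustration.

```python
import sqlite3

# Minimal star schema: a fact table of sales measures with foreign keys
# into product and date dimensions. All names and data are illustrative.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        date_key INTEGER REFERENCES dim_date(date_key),
        quantity INTEGER,
        amount REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "widget", "hardware"), (2, "ebook", "digital")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240101, "2024-01-01", 2024)])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, 20240101, 2, 40.0), (2, 20240101, 1, 9.99)])

# Typical analytical query: revenue by category, joining fact to dimension.
result = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(result)  # → [('digital', 9.99), ('hardware', 40.0)]
```

Keeping measures in the narrow fact table and attributes in the dimensions is what makes aggregations like this cheap, and the same layout maps directly onto Delta tables in a lakehouse.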
design of scalable, secure data architectures on AWS. Build and optimise ETL/ELT pipelines for batch and streaming data. Deploy and manage Apache Spark jobs on Databricks and Delta Lake. Write production-grade Python and SQL for large-scale data transformations. Drive data quality, governance, and automation through CI/CD and IaC. Collaborate with data scientists, analysts …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Amtis Professional Ltd
improve data infrastructure. Explore AI-driven enhancements to boost data accuracy and productivity. Requirements: Strong experience with: Azure Databricks, Data Factory, Blob Storage; Python/PySpark; SQL Server, Parquet, Delta Lake. Deep understanding of: ETL/ELT, CDC, stream processing; Lakehouse architecture and data warehousing; scalable pipeline design and database optimisation. A proactive mindset, strong problem-solving skills …
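Change data capture (CDC), listed in the requirements above, is often implemented as watermark-based incremental loading: each run extracts only rows modified since the last recorded high-water mark, then advances the mark. The sketch below shows that core idea in plain Python; the source rows, `modified` field, and initial watermark are invented for illustration.

```python
# Watermark-based incremental load, the core idea behind many CDC-style
# pipelines: pull only rows changed since the previous run's high-water
# mark, then advance the mark. All data here is invented for illustration.

source = [
    {"id": 1, "value": "a", "modified": "2024-01-01T10:00"},
    {"id": 2, "value": "b", "modified": "2024-01-03T09:00"},
    {"id": 3, "value": "c", "modified": "2024-01-05T12:00"},
]

def incremental_load(rows, watermark):
    """Return rows modified after `watermark`, plus the new watermark.

    ISO-8601 timestamps compare correctly as strings, so no parsing is
    needed for this sketch.
    """
    changed = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in changed), default=watermark)
    return changed, new_watermark

# A run with a stored watermark: only rows 2 and 3 are newer.
batch, mark = incremental_load(source, "2024-01-02T00:00")
print([r["id"] for r in batch], mark)  # → [2, 3] 2024-01-05T12:00
```

In production the watermark would be persisted between runs (e.g. in a control table), and the extracted batch would feed the upsert step downstream.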
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
based database technologies like Snowflake, BigQuery, Redshift Experience in open source technologies like Spark, Kafka, Beam Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with dbt, Delta Lake, Databricks Experience working in an agile environment Here’s a taste of the perks we roll out for our extraordinary team members: 25 Days of Holiday, plus …
Warrington, Cheshire, North West England, United Kingdom Hybrid / WFH Options
The Citation Group
based database technologies like Snowflake, BigQuery, Redshift Experience in open source technologies like Spark, Kafka, Beam Good understanding of cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with dbt, Delta Lake, Databricks Experience working in an agile environment Here’s a taste of the perks we roll out for our extraordinary team members: 25 Days of Holiday, plus …
facing consulting environment, with the ability to manage stakeholder expectations, navigate complex requirements, and deliver tailored solutions across diverse industries. 5+ years' experience working with Databricks, including Spark and Delta Lake Strong skills in Python and/or Scala for data engineering tasks Comfortable working with cloud platforms like Azure, AWS, and/or Google Cloud A problem …