…large-scale datasets. Highly proficient with tools and techniques, including cloud platforms and programming languages, and a full understanding of modern engineering practices. Proficiency in tools such as AWS, Databricks, Snowflake, Ab Initio, Terraform, etc. Passion for automation, optimization, and delivering high-quality data solutions. The ability to deliver work at a steady, predictable pace to achieve commitments, deliver complete … with some of the following tools and platforms (or similar): AWS (S3, Lambda, Kinesis, API Gateway, IAM, Glue, SNS, SQS, EventBridge, EKS, VPC, Step Functions, ECS/EKS, DynamoDB, etc.), Databricks, Python, JavaScript, Kafka, dbt, Terraform, Snowflake, SQL, Jenkins, GitHub, Airflow, Alation, secrets management (HashiCorp Vault, AWS Secrets Manager, or similar), Docker/OpenShift/Open Cloud Foundry, MongoDB, SonarQube. Knowledge …
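As a rough illustration of the serverless side of that AWS stack, the sketch below shows a minimal Lambda handler that reacts to an S3 event and writes a cleaned copy of the object back to S3 via boto3. The bucket name, key prefix, and the "non-null id" rule are hypothetical, not taken from the posting.

```python
import json
import boto3

# Hypothetical destination bucket -- not taken from the posting.
CURATED_BUCKET = "example-curated-data"
s3 = boto3.client("s3")

def handler(event, context):
    """Minimal Lambda handler: read objects from an S3-triggered event,
    apply a trivial transformation, and write the result back to S3."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = json.loads(body)

        # Trivial "transformation": keep only rows with a non-null id.
        cleaned = [r for r in rows if r.get("id") is not None]

        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"curated/{key}",
            Body=json.dumps(cleaned).encode("utf-8"),
        )

    return {"statusCode": 200, "processed": len(records)}
```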
Come join an exciting new project utilizing cutting-edge technologies in Chantilly, Virginia. Skill sets: Python, Databricks, LLM (Large Language Model), NLP (Natural Language Processing), model tuning, containerization (Docker/Kubernetes), model validation/testing, prompt engineering, scaling/optimization techniques, software development, and Apache Spark.
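For the NLP side of that stack, a minimal PySpark sketch is shown below: it tokenises a couple of documents and removes stop words with Spark ML feature transformers, the kind of distributed preprocessing that might sit in front of model tuning or prompt construction. The sample rows and column names are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import Tokenizer, StopWordsRemover

spark = SparkSession.builder.appName("nlp-preprocessing").getOrCreate()

# Hypothetical documents; in practice these would come from a source table.
docs = spark.createDataFrame(
    [(1, "Large language models need clean input text"),
     (2, "Spark distributes preprocessing across the cluster")],
    ["doc_id", "text"],
)

# Tokenise, then strip common English stop words.
tokens = Tokenizer(inputCol="text", outputCol="tokens").transform(docs)
cleaned = StopWordsRemover(inputCol="tokens", outputCol="filtered").transform(tokens)

cleaned.select("doc_id", "filtered").show(truncate=False)
```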
London, South East, England, United Kingdom Hybrid / WFH Options
Arc IT Recruitment
…looking to blend strategic leadership with hands-on technical skills. The role and skill sets... Ownership & Operation: Lead and operate core AWS services, CI/CD DevOps services, and Databricks data platform services. Scale-Up Experience: Been there, done that in a 200-2000+ sized org during its growth journey. Technically Proficient: Comfortable with large infrastructure changes using Pulumi …
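For context on the "large infrastructure changes using Pulumi" point, here is a minimal, hypothetical Pulumi program in Python declaring an S3 bucket and an SQS queue on AWS. Resource names are placeholders; a real platform stack would be considerably larger.

```python
"""Minimal Pulumi sketch (Python): infrastructure as code on AWS."""
import pulumi
import pulumi_aws as aws

# A versioned S3 bucket for platform artefacts (name is hypothetical).
artifacts = aws.s3.Bucket(
    "platform-artifacts",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# An SQS queue that downstream services can consume from.
events = aws.sqs.Queue("platform-events", visibility_timeout_seconds=60)

# Expose outputs so other stacks or CI/CD steps can reference them.
pulumi.export("artifacts_bucket", artifacts.bucket)
pulumi.export("events_queue_url", events.url)
```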
…Vault 2.0 methodologies. Experience designing models across raw, business, and consumption layers. Solid understanding of metadata, cataloguing, and data governance practices. Working knowledge of modern data platforms such as Databricks, Azure, Delta Lake, etc. Excellent communication and stakeholder engagement skills. Data Architect - Benefits: competitive base salary with regular reviews; car allowance - circa £5k per annum; discretionary company bonus; enhanced pension …
…EASA regulations, and strict confidentiality protocols. Strong IT skills, ideally with experience in easyJet systems (Safetynet/AIMS), remote working tools (MS Teams, SharePoint), and data analysis platforms (Tableau, Databricks, Python). Excellent communication, interpersonal, and presentation skills; professional and methodical approach to analysis. Ability to work closely with the Flight Data Manager to ensure the highest standards of integrity …
…of creating thought leadership content and customer-facing collateral. Commercial acumen with experience in costing and pricing solutions. Technical Expertise: strong knowledge of modern data platforms (e.g., Microsoft Fabric, Databricks, Snowflake, AWS); familiarity with data governance, data strategy, and enterprise data foundations; technical proficiency at Solution/Architect level (Level 2–3). Soft Skills: exceptional communication skills …
London (City of London), South East England, United Kingdom
Capgemini
At Capgemini Financial Services, we are seeking an AWS Databricks Developer to join Capgemini's Insights and Data Practice. You will have the following experience: 8+ years of experience in data engineering or cloud development. Strong hands-on experience with AWS services. Proficiency in Databricks, Apache Spark, SQL, and Python. Experience with data modeling, data warehousing, and DevOps practices. Familiarity with Delta Lake, Unity Catalog, and Databricks REST APIs. Excellent problem-solving and communication skills. About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 350,000 team members in …
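As a rough sketch of the Databricks and Delta Lake skills listed above, the snippet below reads raw JSON with PySpark, applies a trivial filter, and appends the result to a Delta table partitioned by ingest date. Paths and column names are hypothetical, and it assumes a runtime where the Delta format is available (Databricks, or local Spark with the delta-spark package).

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

# Hypothetical raw landing zone.
raw = spark.read.json("s3://example-bucket/raw/transactions/")

# Trivial curation step: drop non-positive amounts, stamp the ingest date.
curated = (
    raw.filter(F.col("amount") > 0)
       .withColumn("ingest_date", F.current_date())
)

# Append to a Delta table, partitioned by ingest date.
(curated.write
        .format("delta")
        .mode("append")
        .partitionBy("ingest_date")
        .save("s3://example-bucket/curated/transactions/"))
```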
…the support team to maintain, enhance, and ensure the reliability of our BI systems hosted on Microsoft Azure. Optimize and manage Azure data services, including Azure Data Factory, Azure Databricks, and Azure SQL Database, as well as Power BI for data analysis and visualization. Monitor and troubleshoot data pipelines to ensure seamless and efficient operations. Stay updated with advancements in … fast-paced, dynamic environment. Open-minded, motivated, and self-organized. Nice to have: hands-on experience with a cloud platform, preferably Microsoft Azure, particularly Azure Data Factory, Azure Databricks, Azure SQL Database, Azure Synapse Analytics, and Azure Data Lake Storage. Familiarity with programming languages such as Python, Scala, Java, or C#. Bachelor's or Master's degree in computer science …
Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and … or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and …
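To make the workflow-orchestration requirement concrete, a minimal Apache Airflow DAG is sketched below: two placeholder tasks wired so that a transformation step runs after extraction. The DAG ID, schedule, and task bodies are illustrative assumptions (Airflow 2.4+ is assumed for the `schedule` argument), not part of the posting.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder callables standing in for real extract/transform logic.
def extract():
    print("pull data from the source system")

def transform():
    print("run the Spark / Databricks transformation job")

with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the transformation only after extraction has succeeded.
    extract_task >> transform_task
```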
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/ML/Machine Learning/AI/Artificial Intelligence. Based in the West Midlands/Solihull/Birmingham area, permanent role, £50,000–£70,000 + car/allowance + bonus. One of our … Permanent role. Salary £50,000–£70,000 + car/allowance + bonus. Experience: experience in a Data Engineer/Data Engineering role; large and complex datasets; Azure, Azure Databricks; ML/Machine Learning/AI/Artificial Intelligence; Microsoft SQL Server; Lakehouse, Delta Lake; Data Warehousing; ETL; database design; Python/PySpark; Azure Blob Storage; Azure Data Factory …
…the ability to engage both technical and non-technical audiences. Professional certifications (e.g. DAMA CDMP, DCAM) are desirable. Hands-on knowledge of metadata/cataloguing tools (e.g. Azure Purview, Databricks). Benefits for the Data Governance Manager: hybrid working; 25 days holiday + bank holidays, increasing with service; discretionary annual bonus; enhanced pension scheme; healthcare cash plan; private health insurance …
Employment Type: Permanent
Salary: £65,000 - £80,000/annum, 25 days + bank holidays, bonus + more
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Computerworld Personnel Ltd
…the ability to engage both technical and non-technical audiences. Professional certifications (e.g. DAMA CDMP, DCAM) are desirable. Hands-on knowledge of metadata/cataloguing tools (e.g. Azure Purview, Databricks). Benefits for the Data Governance Manager: hybrid working; 25 days holiday + bank holidays, increasing with service; discretionary annual bonus; enhanced pension scheme; healthcare cash plan; private health insurance …
Database management systems: PostgreSQL, ClickHouse. Deployment tools: Flux, Helm, Kustomize. Frontend frameworks: React, Angular. Infrastructure as code: Terraform, Terragrunt. Cloud provider: AWS. Event streaming platform: Kafka. Big data processing: Databricks. About Chainalysis: Blockchain technology is powering a growing wave of innovation. Businesses and governments around the world are using blockchains to make banking more efficient, connect with their customers, and …
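As an illustration of the event-streaming piece of that stack, here is a minimal Kafka producer sketch using the confluent-kafka Python client: it serialises a small event as JSON, publishes it, and reports the delivery result. The broker address, topic name, and event fields are hypothetical.

```python
import json
from confluent_kafka import Producer

# Hypothetical local broker; in production this would be a broker cluster.
producer = Producer({"bootstrap.servers": "localhost:9092"})

def delivery_report(err, msg):
    """Called once per message to confirm delivery or surface an error."""
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()} [partition {msg.partition()}]")

# Hypothetical event payload.
event = {"tx_id": "abc123", "asset": "BTC", "amount": 0.5}

producer.produce(
    "transfer-events",  # hypothetical topic name
    value=json.dumps(event).encode("utf-8"),
    callback=delivery_report,
)
producer.flush()  # block until outstanding messages are delivered
```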
…business needs our type of product, you'll work with a variety of new clients and industries as Zip scales. Current clients include OpenAI, Coinbase, Snowflake, Notion, Canva, Samsara, Databricks, etc. You will: lead onboarding for new customers, with a heavy emphasis on understanding requirements and creatively configuring the product to solve their problems; be responsible for leading the end-to-end …
You'll play a key role in designing and maintaining data pipelines that make business insights possible. You'll work with a range of Microsoft Azure tools including Data Factory, Databricks, and SQL Server, as well as develop dashboards and KPIs using Power BI. Day-to-day, you'll: Support the design, build, and maintenance of data pipelines. Assist with data … Experience (preferred, but not essential): Experience in a technical IT role. Basic understanding of databases and SQL. Exposure to Python or another programming language. Familiarity with Azure Data Factory, Databricks, or other data tools. Personal qualities: Strong written and verbal communication skills. A collaborative team player with good interpersonal skills. Analytical, detail-oriented, and proactive in problem solving. Adaptable and …
…(this is a senior role and not suitable for junior candidates). Azure Expertise: deep hands-on knowledge of Azure data and AI services, including Azure Machine Learning, Azure Databricks, Azure Data Lake/Synapse, Azure Cognitive Services (Text, Vision, Speech), Azure OpenAI, and Azure AI Foundry; ability to architect solutions that leverage these services cohesively. Architectural Skills: strong skills … expertise in Azure and AI. Examples include Microsoft Certified: Azure Solutions Architect Expert and Azure AI Engineer Associate (AI-102). Certification in machine learning frameworks or platforms (e.g. Databricks Certified Generative AI Engineer Associate) is also valued. Devoteam supports continuous learning and certification attainment. Team: You will be part of a collaborative, remote-friendly team that values continuous learning …
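For a feel of the Azure OpenAI piece of that skill set, the sketch below calls a chat-completions deployment through the openai Python SDK (v1+) against an Azure endpoint. The endpoint, deployment name, API version, and prompt are assumptions for illustration; a production solution would add retries, monitoring, and responsible-AI controls.

```python
import os
from openai import AzureOpenAI

# Endpoint, key, and API version come from your own Azure OpenAI resource.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed; use the version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # the *deployment* name, not the model family
    messages=[
        {"role": "system", "content": "You summarise support tickets."},
        {"role": "user", "content": "Summarise: customer cannot reset password..."},
    ],
    temperature=0.2,
)

print(response.choices[0].message.content)
```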
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid / WFH Options
Littlefish
…Fabric, including OneLake, Dataflows, Notebooks, and Power BI. Strong understanding of data architecture, data modeling, and ETL/ELT processes. Experience with Azure Data Services (e.g., Synapse, Data Factory, Databricks) is a plus. Demonstrates an understanding of enterprise BI and Data Warehousing tools, concepts, and methods. Experience in pre-sales or client-facing consulting roles. Ability to manage multiple projects and priorities in a fast-paced environment. Experience with SSRS/SSAS (Tabular with DAX & OLAP with MDX)/SSIS. Experience with Databricks, PySpark, and other data science tools. Experience of using Azure DevOps/Git. Microsoft Certified on Data (e.g. Fabric). This is a client-facing role, so the following are essential: excellent communication and presentation skills, with the …
Position Summary: The Database Engineer will support the ESCAPE program and broader enterprise data initiatives by designing, implementing, and maintaining scalable, secure, and high-performance database systems. This role ensures data integrity, supports analytics and reporting, and aligns with DoD …