scenarios that can benefit from AI solutions (Machine Learning and Gen AI). Explore and analyse data from various sources and formats using tools such as Microsoft Fabric, Azure Databricks, Azure Synapse Analytics, and Azure Machine Learning. Implement data pipelines and workflows to automate and operationalize machine learning solutions using tools such as Azure ML Pipelines and Azure DevOps. Run experiments … workflows and deploying them at scale using Azure services. Familiarity with data integration tools like Azure Data Factory and data platform solutions such as Microsoft Fabric and/or Databricks. Excellent communication, collaboration, stakeholder management, and problem-solving skills. Familiarity with the Microsoft Copilot stack. Microsoft Certified: Azure Data Scientist Associate or AI Engineer certification is a plus. Experience within …
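To illustrate the experiment-tracking side of the role above, here is a minimal sketch of logging a model run with MLflow (the tracking layer used by Azure Machine Learning and Databricks); the dataset, experiment name and parameters are purely illustrative assumptions.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real business dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-model-poc")  # hypothetical experiment name
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=200, max_depth=8, random_state=42)
    model.fit(X_train, y_train)
    mlflow.log_params({"n_estimators": 200, "max_depth": 8})
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # stores the fitted model as a run artifact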
Python scripts for efficient data processing. Implementing data modeling techniques to ensure scalable and maintainable data architectures. Working with cloud platforms such as Azure or AWS, leveraging tools like Databricks, Data Factory, and Synapse Analytics. Ensuring best practices in DevOps, version control, and data governance. Managing orchestration workflows using Airflow or similar tools. Supporting AI teams by preparing high-quality … experience in Data Engineering. Experience in client-facing roles within consulting companies. Technical Skills: Strong expertise in SQL (especially T-SQL) and Python. Experience in Azure with tools like Databricks, Data Factory, and Synapse Analytics. Knowledge of data modeling techniques. Familiarity with DevOps and version control best practices. Experience with Airflow or other orchestration tools is a plus. Expertise in …
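As a sketch of the orchestration work mentioned above, the following is a minimal Airflow DAG (Airflow 2.4+ syntax) wiring an extract step to a transform step; the DAG name and task bodies are placeholders, not a real pipeline.

from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from the source system")  # placeholder for real extraction logic

def transform():
    print("clean and model the extracted data")  # placeholder for real transformation logic

with DAG(
    dag_id="daily_sales_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # transform runs only after extract succeeds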
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
and users about system issues and solutions. Demonstrated attention to detail in documentation, testing, and workflow processes. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: DOD 8570 IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Familiarity with Apache … Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity …
audiences. Problem-solving skills with a business focus and stakeholder management experience. Knowledge of change management and systems processes. Comfortable with cloud platforms and Microsoft stack tools like Synapse, Databricks, Azure Data Factory, and Azure DevOps. Programming experience in Python, SQL, T-SQL, and SSIS. Understanding of ELT/ETL processes, RESTful APIs, and integration concepts. What’s in it …
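For the RESTful API and ELT integration skills listed above, here is a minimal sketch of pulling records from a paginated REST endpoint into a DataFrame before staging them for transformation; the URL, pagination scheme and field names are assumptions for illustration only.

import requests
import pandas as pd

def fetch_all(base_url: str, page_size: int = 100) -> pd.DataFrame:
    records, page = [], 1
    while True:
        resp = requests.get(base_url, params={"page": page, "per_page": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json()
        if not batch:          # an empty page signals the end of the result set
            break
        records.extend(batch)
        page += 1
    return pd.DataFrame(records)

orders = fetch_all("https://api.example.com/v1/orders")  # placeholder endpoint
# orders would then be landed in ADLS or a SQL staging table for ELT-style transformation downstream.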
Snowflake cloud data warehouse. Proficiency in writing and optimising Python code, especially within Snowflake stored procedures. Familiarity with modern cloud platforms for data integration, such as Azure Data Factory, Databricks, or similar tools. Experience in Power BI, including data modelling, DAX, and managing semantic models. Good understanding of data warehousing, dimensional modelling, and ELT best practices. Knowledge of version control …
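As an illustration of Python inside Snowflake stored procedures mentioned above, a minimal Snowpark handler that deduplicates a staging table into a target table; the table and column names are hypothetical, and registration of the handler as a stored procedure (packages, permissions, CREATE PROCEDURE) is omitted.

from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

def clean_orders(session: Session, source_table: str, target_table: str) -> str:
    df = session.table(source_table)
    cleaned = (
        df.filter(col("ORDER_ID").is_not_null())  # drop rows without a business key
          .drop_duplicates("ORDER_ID")            # keep one row per order
    )
    cleaned.write.mode("overwrite").save_as_table(target_table)
    return f"Loaded {cleaned.count()} rows into {target_table}"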
of Terraform is beneficial but not required. Excellent problem-solving and communication skills. Preferred Qualifications: Experience with additional Azure services like Azure Data Lake, Azure SQL Database, or Azure Databricks. Familiarity with Infrastructure-as-Code (IaC) practices. Strong knowledge of data governance and security best practices in the cloud. Previous experience in a DevOps environment with CI/CD …
learning and Data Science applications. Ability to use a wide variety of open-source technologies. Knowledge and experience using at least one data platform technology such as Quantexa, Palantir and Databricks. Knowledge of test automation frameworks and ability to automate testing within the pipeline. To discuss this or wider Technology roles with our recruitment team, all you need to do is …
relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs. Support the …
Greater London, England, United Kingdom Hybrid / WFH Options
Aventum Group
change management and systems processes. Strong interpersonal and presentation skills with the ability to influence others. Skills and Abilities: Platforms & Tools: Cloud computing platforms (ADLS Gen2), Microsoft Stack (Synapse, Databricks, Fabric, Profisee), Azure Service Bus, Power BI, Delta Lake, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service; Azure ML is a plus. Languages: Python …
About Exsolvæ: Exsolvæ is a Brussels-based consultancy at the forefront of Data and Artificial Intelligence innovation. We specialize in providing expert Consultancy, Audit, and Solution Development services to empower our partners in overcoming unique challenges. Our holistic approach fosters …
Data Architect/Senior Data Architect. £600 - £650 per day, Outside IR35. 6-month contract, 1 day a week onsite in central London. Overview: We're working with a Data Architect to provide expert-level consulting services over a 6-month …
City Of London, England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
greenfield MLOps pipelines that handle very large datasets. You will be responsible for building out a greenfield standardised framework for capital markets. The core platform is built on Azure Databricks Lakehouse, consolidating data from various front and middle office systems to support BI, MI, and advanced AI/ML analytics. As a lead, you will shape the MLOps framework and … data sources (orders, quotes, trades, risk, etc.). Essential Requirements: 2+ years of experience in MLOps and at least 3 years in AI/ML engineering. Knowledge of Azure Databricks and associated services. Proficiency with ML frameworks and libraries in Python. Proven experience deploying and maintaining LLM services and solutions. Expertise in Azure DevOps and GitHub Actions. Familiarity with Databricks … CLI and Databricks Job Bundle. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets and performing data preparation and integration. Experience with Agile methodologies and SDLC practices. Strong problem-solving, analytical, and communication skills. …
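As a sketch of the model-governance side of this MLOps role, the following registers a trained model in the MLflow Model Registry and moves it to a staging stage; the run id and model name are placeholders, and newer MLflow releases favour alias-based promotion over stages.

import mlflow
from mlflow.tracking import MlflowClient

run_id = "abc123"                      # placeholder for the run that produced the model
model_uri = f"runs:/{run_id}/model"
registered = mlflow.register_model(model_uri, "capital-markets-risk-model")  # hypothetical name

client = MlflowClient()
client.transition_model_version_stage(
    name=registered.name,
    version=registered.version,
    stage="Staging",                   # promoted to Production only after validation
)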
Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. Monitor and troubleshoot production jobs and processes. Preferred Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale …
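A minimal sketch of the kind of pipeline step described above: reading raw CSV files from ADLS Gen2 in a Databricks notebook and writing a curated Delta table; the storage paths, schema and column names are illustrative assumptions.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically inside a Databricks notebook

raw = (
    spark.read
         .option("header", "true")
         .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")  # placeholder container/path
)

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("ingest_date", F.current_date())
       .dropDuplicates(["order_id"])
)

(curated.write
        .format("delta")
        .mode("append")
        .partitionBy("ingest_date")
        .save("abfss://curated@examplelake.dfs.core.windows.net/sales_delta/"))  # placeholder path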
design. Experience architecting and building data applications using Azure, specifically a Data Warehouse and/or Data Lake. Technologies: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, Azure Databricks and Power BI. Experience with creating low-level designs for data platform implementations. ETL pipeline development for the integration with data sources and data transformations including the creation of supplementary … with APIs and integrating them into data pipelines. Strong programming skills in Python. Experience of data wrangling such as cleansing, quality enforcement and curation e.g. using Azure Synapse notebooks, Databricks, etc. Experience of data modelling to describe the data landscape, entities and relationships. Experience with data migration from legacy systems to the cloud. Experience with Infrastructure as Code (IaC) particularly …
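For the cleansing and quality-enforcement work mentioned above, a minimal sketch of a wrangling step as it might run in a Synapse or Databricks notebook, splitting valid rows from rejected ones; the table names and validation rules are assumptions for illustration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
customers = spark.table("staging.customers")  # hypothetical staging table

rules = (
    F.col("customer_id").isNotNull()
    & F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # crude email sanity check
)

valid = customers.filter(rules)
rejected = customers.filter(~rules).withColumn("rejected_at", F.current_timestamp())

valid.write.mode("overwrite").saveAsTable("curated.customers")
rejected.write.mode("append").saveAsTable("quarantine.customers_rejected")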
more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we’re still growing. Job Purpose: With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and … solutions. Job Accountabilities: Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing … workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam). Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez …
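As a sketch of the data-quality frameworks named above, a couple of expectations using the classic Great Expectations Pandas API (newer GX releases use a context/validator-based API instead); the sample data is invented for illustration.

import pandas as pd
import great_expectations as ge

flights = pd.DataFrame({
    "flight_id": ["EZY101", "EZY102", "EZY103"],  # illustrative rows only
    "passengers": [180, 150, 186],
})

dataset = ge.from_pandas(flights)
not_null = dataset.expect_column_values_to_not_be_null("flight_id")
in_range = dataset.expect_column_values_to_be_between("passengers", min_value=0, max_value=240)

assert not_null.success and in_range.success, "data quality checks failed"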
data quality solutions. The ideal candidate should have strong expertise in ETL framework testing (preferably Talend or DataStage), BI report testing (preferably Power BI, Cognos), cloud technologies (preferably Azure, Databricks), SQL/PLSQL coding, and Unix/Python scripting. Key Responsibilities: Lead and mentor a team of test engineers, assisting them with technical challenges and guiding them on best testing … frameworks. Understanding of DataOps & TestOps concepts for continuous data quality testing and automation. Experience validating unstructured data formats, including XML, JSON, Parquet. Knowledge of cloud data platforms like Azure and Databricks for data processing and analytics. In addition to our open-door policy and supportive work environment, we also strive to provide a competitive salary and benefits package. TJX considers all …
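To illustrate the ETL testing described above, a minimal reconciliation-style test comparing a source extract with the loaded target; in a real suite the two frames would come from queries against the source system and the warehouse rather than in-memory data.

import pandas as pd

def reconcile(source_df: pd.DataFrame, target_df: pd.DataFrame, key: str) -> dict:
    # Compare row counts and key coverage between a source extract and the loaded target.
    missing_keys = set(source_df[key]) - set(target_df[key])
    return {
        "source_rows": len(source_df),
        "target_rows": len(target_df),
        "missing_keys": sorted(missing_keys),
    }

def test_orders_load():
    source = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
    target = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
    result = reconcile(source, target, key="order_id")
    assert result["source_rows"] == result["target_rows"]
    assert result["missing_keys"] == []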
West Midlands, England, United Kingdom Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
You'll play a key role in shaping the data strategy, enhancing platform capabilities, and supporting business intelligence initiatives. Key Responsibilities: Design and develop Azure data pipelines using Data Factory, Databricks, and related services. Implement and optimize ETL processes for performance, reliability, and cost-efficiency. Build scalable data models and support analytics and reporting needs. Design and maintain Azure-based data …
government services. Your knowledge and certifications: Any MS Azure data certifications; 2+ years working with Azure data engineering tools, e.g. Azure Data Factory, Azure Synapse, Azure SQL, Azure Databricks, Microsoft Fabric, Azure Data Lake. Exposure to other data engineering and storage tools: Snowflake; AWS tools – Kinesis/Glue/Redshift; Google tools – BigQuery/Looker. Experience working with open …
data-heavy systems, ideally in a startup or fast-moving environment. Technical Stack: Languages/Tools: Python (REST API integrations), DBT, Airbyte, GitHub Actions. Modern Data Warehousing: Snowflake, Redshift, Databricks, or BigQuery. Cloud & Infra: AWS (ECS, S3, Step Functions), Docker (Kubernetes or Fargate a bonus). Data Modelling: Strong grasp of transforming structured/unstructured data into usable models (facts, dimensions …
Data Engineer – Databricks. About the Role: We’re looking for a Databricks Champion to design, build, and optimize data pipelines using Databricks. You’ll work with clients and internal teams to deliver scalable, efficient data solutions tailored to business needs. Key Responsibilities: Develop ETL/ELT pipelines with Databricks and Delta Lake; integrate and process data from diverse sources; collaborate with data scientists, architects, and analysts; optimize performance and manage Databricks clusters; build cloud-native solutions (Azure preferred, AWS/GCP also welcome); implement data governance and quality best practices; automate workflows and maintain CI/CD pipelines; document architecture and processes. What We’re Looking For (Required): 5+ years in data engineering with hands-on Databricks experience; Databricks Champion Status (Solution Architect/Partner); proficiency in Databricks, Delta Lake, Spark, Python, SQL; cloud experience (Azure preferred, AWS/GCP a plus); strong problem-solving and communication skills. Databricks Champion …
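A minimal sketch of the incremental-load pattern behind the ETL/ELT responsibilities above: an upsert (MERGE) into a Delta table on Databricks; the staging and target table names and the join key are illustrative assumptions.

from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

updates = spark.table("staging.customer_updates")        # hypothetical view of new/changed rows
target = DeltaTable.forName(spark, "curated.customers")  # hypothetical curated Delta table

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # refresh rows that already exist
       .whenNotMatchedInsertAll()   # insert rows seen for the first time
       .execute())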
Passionate about Databricks and Data Engineering? Aivix has been in operation for over a year and recently achieved AWS Select Partner & Databricks Registered Partner status. We excel in offering analytics solutions that facilitate data-driven digital transformations for our customers. Aivix is currently looking for data engineers who are interested in Databricks. You will be shaping the beating heart of … we are an enthusiastic group. We encourage everyone to be themselves while actively empowering an entrepreneurial mindset. Ready to contribute and help build our growing story? As a Databricks Engineer you will be involved in: making use of best practices in Databricks, Big Data and AI projects; having conversations with the customer to correctly identify the data needs; developing …