and at least 2 years of experience with cloud-based AI/ML technologies (such as tools from AWS, Azure, Google, Hugging Face, OpenAI and Databricks) building ML or applied AI solutions. • A passion for Generative AI, and an understanding of the strengths and weaknesses of Generative LLMs • Fundamental knowledge of …
Experience building scalable, high-quality data models that serve complex business use cases. Knowledge of dbt and experience with cloud data warehouses (BigQuery, Snowflake, Databricks, Azure Synapse, etc.). Proficiency in building BI dashboards and self-service capabilities using tools like Tableau and Looker. Excellent communication skills and experience in …
demonstrate most or all of the following: Technical Mastery: Expertise in cloud data platforms (AWS, Azure, GCP) and enterprise data solutions (e.g., Microsoft Fabric, Databricks, Snowflake). Proficiency in programming languages like Python, R, or SQL and experience with Power Platform applications. Strong knowledge of generative AI tools, geospatial analytics …
Insurance and Reinsurance Finance GAAP Accounting and Actuarial Reserve Lloyd's Regulatory Reports Databases (Data Warehouse, T-SQL, SSIS, SQL Server 2019) ETL Process with Databricks Programming languages (e.g. Python) AWS Skills Agile methodologies and Jira preferable MS Office products (e.g. Access, Excel) Educated to a science-based BSc degree or …
Position: Data Engineer (Python/Databricks) Location: Remote Salary: up to £80,000 + Benefits Are you passionate about health tech and innovation? Do you want to be at the forefront of transforming clinical research with cutting-edge technology? If so, we have an exciting new role for you! Join … integrations. Ensure Data Security: Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with … Databricks: Utilize Databricks to design and maintain scalable data infrastructure. Integration with Azure Data Factory: Leverage Azure Data Factory for orchestrating and automating data movement and transformation. Python Development: Write clean, efficient code in Python (3.x), using frameworks like FastAPI and Pydantic. Database Management: Design and manage relational schemas and …
external data integrations. Ensure Data Security: Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with Databricks … Utilize Databricks to design and maintain scalable data infrastructure. Integration with Azure Data Factory: Leverage Azure Data Factory for orchestrating and automating data movement and transformation. Python Development: Write clean, efficient code in Python (3.x), using frameworks like FastAPI and Pydantic. Database Management: Design and manage relational schemas and databases, with a strong …/ELT Processes: Develop and optimize data models, ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data …
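For context on the stack this listing names (Databricks, Delta Lake, Unity Catalog), below is a minimal PySpark sketch of the kind of pipeline step such a role typically builds. The catalog, schema, table, and file path names are hypothetical examples, not taken from the advert, and a Unity Catalog-enabled Databricks workspace is assumed.

```python
# Minimal sketch of a Databricks/Delta Lake pipeline step (PySpark).
# All catalog, schema, table, and path names here are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Ingest a raw extract (illustrative path) and apply light cleaning.
raw_df = spark.read.option("header", "true").csv("/Volumes/clinical/raw/lab_results")
clean_df = (
    raw_df
    .withColumn("result_value", F.col("result_value").cast("double"))
    .dropDuplicates(["patient_id", "test_code", "collected_at"])
)

# Persist as a governed Delta table using Unity Catalog's three-level namespace.
clean_df.write.format("delta").mode("overwrite").saveAsTable("clinical.curated.lab_results")

# Downstream jobs or BI tools can then read the curated table.
spark.table("clinical.curated.lab_results").groupBy("test_code").count().show()
```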
have a strong understanding of various financial products and the trading life cycle. The role Advanced proficiency in SQL (T-SQL, PL/SQL, Databricks SQL) Knowledge of Kimball data modelling methodology Experience using scripting languages such as Python, PowerShell etc. Experience with Microsoft Azure. Strong knowledge of ETL/…
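As a concrete illustration of the Kimball-style dimensional modelling this listing mentions, a typical query joins a fact table to its conformed dimensions on surrogate keys. The sketch below runs such a star-schema query through PySpark's SQL interface; fact_trades, dim_date, dim_product and their columns are hypothetical names, not drawn from the advert.

```python
# Hedged sketch of a Kimball-style star-schema query via spark.sql.
# fact_trades, dim_date, dim_product and their columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

daily_notional = spark.sql("""
    SELECT d.calendar_date,
           p.product_type,
           SUM(f.notional_amount) AS total_notional
    FROM   fact_trades AS f
    JOIN   dim_date    AS d ON f.date_key    = d.date_key
    JOIN   dim_product AS p ON f.product_key = p.product_key
    WHERE  d.calendar_year = 2024
    GROUP  BY d.calendar_date, p.product_type
""")

daily_notional.show()
```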
with the expectation of 2-3 days/week onsite as the standard. Must Haves: Data Engineering/Power BI experience Strong competence with Azure Databricks, SQL, Python, Power BI, DAX Proven experience leading the design of data models, transformation logic, and build of all Power BI dashboards - including testing, optimization & integration …
London, South East England, United Kingdom - Hybrid / WFH Options
Insight Global
web and mobile app development, SQL, ETL or data pipelines, and data analysis. You have experience with cloud data warehouses/lakes including Snowflake, Databricks, BigQuery, Redshift, S3, and ADLS. You have experience with AWS, GCP, and/or Azure cloud services. You have strong technical skills and experience with …
teams spread globally. What we value These skills will help you succeed in this role: Full stack cloud developer skills: Data (Delta Lake/Databricks), PL/SQL, Java/J2EE, React, CI/CD pipeline, and release management. Strong experience in Python, Scala/PySpark, Perl/scripting. Experience …
that actually matter Fast-track your career with mentorship from senior experts Gain valuable experience across multiple industries and cloud platforms (Azure, AWS, GCP, Databricks) Competitive salary with genuine development opportunities Currently serving clients in four countries with two live products and two cutting-edge AI solutions in development. This …
WC2A, Holborn and Covent Garden, Greater London, United Kingdom - Hybrid / WFH Options
Avanti Recruitment
and SQL for accessing and processing data (PostgreSQL preferred but general SQL knowledge is more important). Familiarity with the latest Data Science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g. TensorFlow, MXNet, scikit-learn). Knowledge of software engineering practices (coding practices to DS, unit testing, version control, code …
architectures, with a focus on Kafka and Confluent platforms. In-depth knowledge of architecting and implementing data lakes and lakehouse platforms, including experience with Databricks and Unity Catalog. Proficiency in conceptualising and applying Data Mesh and Data Fabric architectural patterns. Experience in developing data product strategies, with a strong inclination …
organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face-to-face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, TensorFlow, PyTorch, scikit-learn) Cloud platforms - demonstrable experience of building and deploying solutions to Cloud (e.g. …
Exciting Opportunity in Health Tech! Position: Data Engineer (Python/Databricks) Location: Remote Salary: up to £80,000 + Benefits Reporting To: Vice President of Software Development Are you passionate about health tech and innovation? Do you want to be at the forefront of transforming clinical research with cutting-edge … integrations. Ensure Data Security: Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with … Databricks: Utilize Databricks to design and maintain scalable data infrastructure. Integration with Azure Data Factory: Leverage Azure Data Factory for orchestrating and automating data movement and transformation. Python Development: Write clean, efficient code in Python (3.x), using frameworks like FastAPI and Pydantic. Database Management: Design and manage relational schemas and …
technologies can deliver tangible value and generate revenue for the organisation. You will leverage your deep technical expertise in modern data platforms, such as Databricks, Snowflake, and Microsoft Fabric, along with your customer-facing skills to consult, design, and implement advanced data solutions. You will also assist in crafting proposals … data & AI solutions. Conduct technical workshops/basecamp sessions, product demos, and presentations to potential clients, showcasing our expertise in modern data platforms like Databricks, Snowflake, Microsoft Fabric & other cloud-native offerings. Develop detailed solution architectures, high-level designs, and proof-of-concept prototypes. Contribute to our offshore-based Practice … data & AI solution delivery, with a strong focus on pre-sales and customer-facing roles. Proven experience working with modern data platforms such as Databricks, Snowflake, Microsoft Fabric, Azure Synapse, AWS Redshift, or similar technologies. Expertise in cloud data architectures (Lakehouse, Hub & Spoke, Data Mesh, etc.), data lakes, and big …
Want to help solve the world's toughest problems with data and AI? This is what we do every day at Databricks. Databricks operates at the leading edge of the Data and AI space. Our customers turn to us to lead the accelerated innovation their businesses need to gain first … ultra-competitive landscape. We are looking for a creative, delivery-oriented Named Enterprise Account Executive to maximise the phenomenal market opportunity that exists for Databricks in South Africa. As an Account Executive, you know how to sell innovation and value to existing customers, identify new use cases and grow consumption … time frames, next steps, and forecasting in Salesforce Identify new use case opportunities and showcase value to existing customers Promote the value of the Databricks Data Intelligence Platform Orchestrate and utilise our field engineering teams to ensure valuable outcomes for clients Build and demonstrate value with all engagements to guide …
data engineering/BI skills, with a focus on having delivered in Microsoft Azure Strong experience designing and delivering data solutions in Fabric, Azure Databricks or Azure Synapse Proficient with SQL and Python Great communication skills, engaging effectively with Senior Stakeholders Nice to haves: Azure Data Engineering certifications Microsoft Fabric …
London, South East England, United Kingdom - Hybrid / WFH Options
Peaple Talent
in a data engineering or data science capacity, ideally within a cloud-based setup. Solid grasp of SQL, ETL workflows, data modelling principles, and Databricks development. Familiarity with DevOps tooling such as Git, CI/CD pipelines, and Azure DevOps. Confident in working with programming languages like SQL and Python …