… of Python. Experience developing in the cloud (AWS preferred). Solid understanding of libraries like Pandas and NumPy. Experience in data warehousing tools like Snowflake, Databricks, BigQuery. Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools. Commercial experience with performant database programming in SQL. Capability to solve complex …
… marketing optimisation. Proficiency in Python, SQL, Bash, and Git, with hands-on experience in Jupyter notebooks, Pandas, and PyTorch. Familiarity with cloud platforms (AWS, Databricks, Snowflake) and containerisation tools (Docker, Kubernetes). Strong problem-solving skills and a passion for driving measurable business impact. Knowledge of marketing measurement techniques like …
Write clean, efficient, and reusable code with clear documentation. Required Experience: Strong expertise in Microsoft SQL or T-SQL for data querying. Experience with Databricks, Azure, and DevOps. Background in data warehousing and development. Hands-on experience with database modeling, documentation, and ETL processes. 7+ years' experience in the above …
London, South East England, United Kingdom Hybrid / WFH Options
VirtueTech Recruitment Group
… models and dashboards, ideally in Power BI. Software development methodologies (Sprints/Agile) and project management software (Jira Software). SQL Server Database. Databricks (or an alternative modern data platform such as Snowflake). Knowledge of orchestration tools and processes (e.g. SSIS, ODI, Informatica, Data Factory). Power BI development, including the …
… Elasticsearch, Logstash, Kibana, and Beats), you will also collaborate within a broader Data Science function engaged with advanced technologies like artificial intelligence, machine learning, Databricks, Node.js, and GraphQL. Join a collaborative engineering environment where you'll work alongside multiple development teams to ensure data streams are accurate, reliable, and effectively …
… in Marketing & Data Storytelling. This remote role offers an exciting opportunity to work with cutting-edge technologies such as Athena/Presto, Python (notebooks), Databricks, Google Analytics, and top BI tools like Looker and Tableau. About Constructor: Constructor is the only search and product discovery platform tailor-made for enterprise …
… ML solution architecture with large-scale, complex environments. Proven experience with cloud platforms such as Azure or AWS, and data platforms such as Azure Databricks. Exposure to generative AI and foundation models (LLMs). Capability to develop presentations for, and present to, executive and senior-level committees. Strong influencing and communication skills …
Keen interest in some of the following areas: Big Data Analytics (e.g. Google BigQuery/BigTable, Apache Spark), Parallel Computing (e.g. Apache Spark, Kubernetes, Databricks), Cloud Engineering (AWS, GCP, Azure), Spatial Query Optimisation, Data Storytelling with (Jupyter) Notebooks, Graph Computing, Microservices Architectures. Modelling & Statistical Analysis experience, ideally customer related. A …
Keen interest in some of the following areas: Big Data Analytics (e.g. Google BigQuery/BigTable, Apache Spark), Parallel Computing (e.g. Apache Spark, Kubernetes, Databricks), Cloud Engineering (AWS, GCP, Azure), Spatial Query Optimisation, Data Storytelling with (Jupyter) Notebooks, Graph Computing, Microservices Architectures. A university degree - numbers-based, Computer Science or …
… data ingestion, data storage, data serving, APIs, etc.). Hands-on experience in data lake or data warehouse and related technologies (e.g. Spark, ETL, Databricks). Hands-on applied experience delivering system design, application development, testing, and operational stability. Advanced in one or more programming languages, such as Java or …
… next era of innovation. Over the past year, we have rebuilt our data infrastructure from the ground up, leveraging the latest advancements in Azure Databricks, Microsoft Fabric, and OneLake. Right now, our Analytics team is migrating all analytic solutions from our legacy AWS platform onto this powerful new stack.
… queries for huge datasets. Has a solid understanding of blockchain ecosystem elements like DeFi, Exchanges, Wallets, Smart Contracts, mixers and privacy services. Bonus Experience: Databricks and PySpark; analysing blockchain data; building and maintaining data pipelines; deploying machine learning models; use of graph analytics and graph neural networks; following funds on …
… technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). Preferred qualifications, capabilities, and skills: knowledge of AWS; knowledge of Databricks; understanding of Cloudera Hadoop, Spark, HDFS, HBase, Hive; understanding of Maven or Gradle. About the Team: J.P. Morgan is a global leader in financial services …
… you: Bachelor's degree in computer science, engineering, or related field. Creative mindset, ability to think outside the box. Knowledge of/experience using Databricks/Azure Data Factory/SQL/Python. The ability to communicate with diverse stakeholder groups, including senior management, and demonstrated ability to influence individuals beyond …
… identify creative solutions to deliver outcomes in the face of obstacles. Knowledge of common data science tools around SQL-based data warehousing (e.g., Snowflake, Databricks, DBT), BI tools (e.g., Tableau, Looker), workflow orchestration, and ML Ops. Excellent spoken and written English skills. Fluency with scripting in Python. Ability to …
… Internet-facing web applications. Experience of data integration and streaming data analytics pipeline technologies. Experience of cloud big data analytics platforms and technologies (e.g. Databricks). Certified Kubernetes knowledge (e.g. CKAD). Certified AWS developer knowledge (e.g. AWS Developer Associate). About the Team: J.P. Morgan is a global leader in financial services …
Hook, Hampshire, United Kingdom Hybrid / WFH Options
360 Resourcing Solutions
… team management skills to lead, motivate and grow a diverse team who are geographically dispersed. Comfortable using a multi-platform stack to deliver products (Databricks, Azure ML Studio, Azure and Snowflake, SageMaker, AWS). Ability to take ownership of a broad data product set and drive forward the delivery of …
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
… and solutions, both for today and for a new era of building. To support our progress, we are currently recruiting for a Data Engineer (Databricks & BI) to come and join our team at Ibstock Head Office, LE67 6HS. (Applicants must have Right to Work in the UK - we are unable … Data Engineer/BI Developer will play a critical role in developing and refining our modern data lakehouse platform, with a primary focus on Databricks for data engineering, transformation, and analytics. The role will involve designing, developing, and maintaining scalable data solutions to support business intelligence and reporting needs. This … platform integrates Databricks, Power BI, and on-premises JDE systems to provide near real-time insights for decision-making. Key Accountabilities: Ensure the existing design, development, and expansion of a near real-time data platform integrating AWS, Databricks and on-prem JDE systems. Develop ETL processes to integrate and transform …
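For context on the kind of Databricks ETL work this posting describes, below is a minimal PySpark sketch of an incremental load from a raw landing zone into a Delta table. It is purely illustrative: the paths, table name, and order_id key are hypothetical and not taken from the advert, and it assumes a Databricks (or Delta Lake-enabled Spark 3.4+) environment.

```python
# Minimal sketch (hypothetical names/paths): incremental upsert of raw JDE-style
# extracts into a Delta table. Assumes Delta Lake is available on the cluster
# and an upstream job already lands Parquet files in the raw zone.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

RAW_PATH = "/mnt/raw/jde/sales_orders"      # hypothetical landing path
TARGET_TABLE = "analytics.sales_orders"     # hypothetical Delta table

# Read the latest raw batch and apply light transformations.
raw = (
    spark.read.format("parquet").load(RAW_PATH)
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("ingested_at", F.current_timestamp())
)

if spark.catalog.tableExists(TARGET_TABLE):
    # Upsert: update existing orders, insert new ones.
    target = DeltaTable.forName(spark, TARGET_TABLE)
    (
        target.alias("t")
        .merge(raw.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # First run: create the Delta table from the initial batch.
    raw.write.format("delta").saveAsTable(TARGET_TABLE)
```

In a genuinely near real-time setup the JDE feed would more likely arrive via CDC or Auto Loader, with the merge running as a scheduled or streaming job rather than a one-off batch.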
… efficiency and maximize business value by confidently utilising trustworthy data. What are we looking for? 5+ years as a Data Engineer. Experience with Spark, Databricks, or similar data processing tools. Proficiency in working with the cloud environment and various software including SQL Server, Hadoop, and NoSQL databases. Proficiency in Python … to new technologies and languages. Expertise in designing and building Big Data databases, analytics, and BI platforms. Strong understanding and experience in working with Databricks Delta Lake. Keen interest in the latest trends and tools in data engineering and analytics. Familiarity with graph databases (e.g., Neo4J/Cypher). Experience … maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS …
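As a rough illustration of the Spark/Delta Lake reporting work this listing mentions, here is a minimal PySpark sketch that aggregates a fact table into a daily summary for BI consumption. The table and column names are hypothetical and the schemas are assumed to exist already.

```python
# Minimal sketch (hypothetical table/column names): aggregate a Delta Lake fact
# table into a daily summary consumed by downstream BI dashboards.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("lakehouse.fact_orders")   # hypothetical Delta table

daily_summary = (
    orders
    .where(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("order_ts").alias("order_date"), "region")
    .agg(
        F.countDistinct("order_id").alias("orders"),
        F.sum("net_amount").alias("revenue"),
    )
)

# Overwrite the reporting table each run; a Delta merge would suit incremental loads.
(
    daily_summary.write.format("delta")
    .mode("overwrite")
    .saveAsTable("reporting.daily_sales_summary")
)
```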
This is a remote role and can be based anywhere in Switzerland, the UK or Germany. Databricks operates at the leading edge of the Unified Data Analytics and AI space. Our customers turn to us to lead the accelerated innovation that their businesses need to gain first-mover advantage in … of business in the Czech Republic, Romania & the wider CEE region. Reporting to our Director of Enterprise Sales, as an Enterprise Account Executive at Databricks you will focus on enhancing established commercial relationships and closing new business opportunities within a small, targeted portfolio of existing, enterprise-class clients in the … strategy and managing it to success. Understanding of how to identify important use cases and buying centres in order to increase the impact of Databricks within an organisation. Fluent (C1 or C2) English and Czech or Romanian language skills essential. About Databricks: Databricks is the data and AI company. More …
… and optimize data models, warehouse solutions, and ETL processes. Work with Scala, Spark, and Java to handle large-scale data processing. Contribute to manual Databricks-like data processing solutions. Requirements: Minimum of 4 years of experience with Scala, Spark, and Java. Strong technical skills and a passion for working with …