Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
working in the world of Data Science. You're more than capable with SQL & Python. You have exposure to big data technologies such as Spark. Ideally you will have experience with statistical analysis, machine learning algorithms, and data mining techniques. You have excellent communication skills and can communicate well…
genuine passion for renewable energy and sustainability. Desirable Skills: Experience with time series analysis and forecasting. Familiarity with big data technologies such as Hadoop, Spark, or similar. Knowledge of energy systems, grid management, or related areas. Experience with cloud platforms like AWS, Google Cloud, or Azure. What's on…
multiple tasks and projects simultaneously. Preferred Qualifications: AWS Certified Solutions Architect or other relevant AWS certifications. Experience with big data technologies such as Hadoop, Spark, or similar. Knowledge of data governance and data quality best practices. Familiarity with machine learning and AI concepts and tools.
Wandsworth, Greater London, Dundonald, United Kingdom
DataBuzz
Bricks setup using Terraform experience. * Experience of MLOps and DataOps. * Experience of using container technologies, cloud platforms (ideally AWS), and distributed processing frameworks like Spark and Dask. * Experience in JavaScript application development and UI design. * Expertise in developing mobile applications. * Familiarity with the agile software development process. If you…
to 10% What Will Help You On The Job: Familiarity with running software services at scale; AWS Infrastructure, Airflow, Kafka and data streaming using Spark/Scala; Understanding of networking fundamentals (OSI layers 2-7); Technical and software engineering background in the areas of cloud computing, enterprise computing, servers and…
of different platforms. The data will be stored and transported securely while still being efficiently queryable. Technologies used include: Data Technologies: Kafka, Spark, Debezium, GraphQL; Programming Languages: Java, Scripting; Database Technologies: MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE; Microservice Technologies: REST, Spring Boot, Jersey; Build and…
tasks, both for oneself and others, ensuring progress aligns with project goals. Nice to have: Previous experience in startup environments. Proficiency or experience with Apache Spark. Familiarity or background in working with Azure. Experience orchestrating workflows, particularly within distributed system environments. Knowledge of MLOps principles and practices, especially in…
and partners Preferred Requirements: Experience or strong interest in blockchain and other Web 3.0 technologies. Experience with OLAP technologies such as Presto/Trino, Spark, Hadoop, Athena, or BigQuery is a plus. Experience in Golang or any other strongly-typed programming language. Experience mentoring and supporting fellow engineers. Our…
of AI techniques including graph data analytics, time series, NLP, deep learning, supervised and unsupervised machine learning, etc. Programming skills in Python or R, Spark and SQL. Worked with open-source data science libraries and understand how to apply them to various problem types. Experience of using the latest…
as Tableau, Power BI and Sigma. • Experience with programming languages such as Python, R, and/or Julia. • Familiarity with data processing frameworks like Spark or Hadoop is a plus. • Solid understanding of statistical analysis techniques, data mining methods, and machine learning algorithms. • Strong analytical and problem-solving skills…
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
refining them to strong results. Exposure to the Python data science stack. Knowledge and working experience of Agile methodologies. Proficient with SQL. Familiarity with Databricks, Spark and geospatial data/modelling is a plus. We’ll help you gain… Experience working in a high-performance environment where collaboration and business…
for diverse audiences. Experience with data manipulation, querying, and modeling using SQL databases (e.g., Redshift, PostgreSQL) and familiarity with big data technologies (e.g., Hadoop, Spark) is desirable. Proficiency in query performance tuning and analysis. Experience using Google Analytics, creating custom reports and extracting information from GA4. Familiarity with Geospatial…
etc.) Have experience productionising machine learning models. Are an expert in one of predictive modeling, classification, regression, optimisation or recommendation systems. Have experience with Spark. Have knowledge of DevOps technologies such as Docker and Terraform, and MLOps practices and platforms like MLflow. Have experience with agile delivery…
Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers. Experience in Spark (Scala/Python/Java) and Kafka. Experience in MDM, Metadata Management, Data Quality and Data Lineage tools. E2E Data Engineering and Lifecycle (including … Cloud Data Fusion, Dataprep, etc.) - Construction Industry Sector Preferred. Google Cloud Platform, Time-Series Data, Building Control and Monitoring Systems. Java, Scala, Python, Spark, SQL. Experience of developing enterprise-grade ETL/ELT data pipelines. Deep understanding of data manipulation/wrangling techniques. Demonstrable knowledge of applying Data … DB/Neo4j/Elastic, Google Cloud Datastore. Snowflake Data Warehouse/Platform. Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming. Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc. Experience and knowledge of application containerisation: Docker, Kubernetes…
Skills & Experience At least 10 years' experience working with JavaScript or Python/Java. Previous experience deploying software into the cloud: EKS, Docker, Kubernetes. Apache Spark or NiFi. Microservice architecture experience. Experience with AI/ML systems.
as R, Python, Azure, Machine Learning (ML), and Databricks. Essential criteria and experience: Proficiency in one or more analytical tools, e.g. R, Python, Tableau, Apache Spark, etc. Proficiency in Azure Machine Learning and Azure Databricks. Pro-activity and a self-starting attitude. Excellent analytical and problem-solving ability.
data analysis techniques. Ability to produce clear graphical representations and data visualisations. Knowledge of data analysis tools, for example R Programming, Tableau Public, SAS, Apache Spark, Excel, RapidMiner. Knowledge of data modelling, data cleansing, and data enrichment techniques. An understanding of data protection issues. Experience of working with…
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard Ltd
Experience with a JVM language (Kotlin, Java, Scala, Clojure). Knowledge of TypeScript and React is beneficial. Exposure to data pipelines using technologies such as Spark and Kafka. Experience with cloud services (ideally AWS). Hybrid working: 1-2 days per week in Central London. £110,000 depending on experience. Please…
Python. Develop real-time streaming features using big data tools such as Spark. SKILLS AND EXPERIENCE Extensive experience using big data tools such as Apache Spark. Experience working with and maintaining an AWS database. Strong Python coding background. Good knowledge of working with SQL. THE BENEFITS Generous holiday plan.
products like Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Detailed knowledge of developing in Databricks and experience in coding with PySpark and Spark SQL. ETL coding standards: ensuring that code is standardised, self-documenting and can be reliably tested. Knowledge of best-practice data encryption techniques and…
design. Cloud data products such as: Data Factory, Event Hubs, Data Lake, Synapse, Azure SQL Server. Experience developing in Databricks and coding with PySpark and Spark SQL. Proficient in ETL coding standards. Data encryption techniques and standards. Knowledge of relevant legislation such as: Data Protection Act, EU Procurement Directives, Freedom…
Familiarity with data warehousing concepts and technologies, including star and snowflake schema designs. Big Data Technologies: Understanding of big data platforms such as Hadoop and Spark, and tools like Hive, Pig, and HBase. Data Integration: Ability to integrate data from disparate sources using middleware or integration tools, as well as…
data quality issues and enhancing model performance. Creative Solutions: A history of innovative problem-solving and independent solution development. Big Data Technologies: Knowledge of Spark or PySpark for handling large-scale data processing. ML Expertise: A thorough understanding of machine learning techniques and best practices in model development. Desirable…