technical skills, knowledge and professional qualifications provided: Databases – has a deep understanding of relational databases. Uses DevOps methods and works in an Agile environment. Programming – expertise in Python, Java, Scala, or other programming languages used to build data pipelines, implement data transformations, and automate data workflows. Has expertise in SQL. AI and Machine Learning knowledge as it applies to end …
London (City of London), South East England, United Kingdom
Blenheim
knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar. Experience building, defining, and owning data models, data lakes, and data warehouses. Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud-native environment (e.g. Fabric, AWS, GCP, or Azure). Excellent stakeholder management and communication skills. A strategic mindset, with a practical approach to …
Employment Type: Permanent
Salary: £80000 - £95000/annum Attractive Bonus and Benefits
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
business to deliver value-driven solutions. What we're looking for: London/Lloyd's Market experience is essential. Strong programming skills in Python and SQL; knowledge of Java or Scala is a plus. Solid experience with relational databases and data modelling (Data Vault, Dimensional). Proficiency with ETL tools and cloud platforms (AWS, Azure or GCP). Experience working in Agile and …
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
to their growth and development. Apply agile methodologies (Scrum, pair programming, etc.) to deliver value iteratively. Essential Skills & Experience: Extensive hands-on experience with programming languages such as Python, Scala, Spark, and SQL. Strong background in building and maintaining data pipelines and infrastructure. In-depth knowledge of cloud platforms and native cloud services (e.g., AWS, Azure, or GCP). Familiarity with …
flow issues, optimize performance, and implement error handling strategies. Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Experience with ETL tools and big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity …
of AWS services including S3, Redshift, EMR, Kinesis and RDS. Experience with Open Source Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.). Ability to write code in Python, Ruby, Scala or other platform-related Big Data technology. Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build …
knowledge sharing across the team. What We're Looking For: Strong hands-on experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala or Java, with strong experience in Big Data technologies such as Spark and Hadoop. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven …
with delivery. You bring: Proven experience in data engineering at scale, including solution design, optimisation, and successful delivery. Strong coding skills in at least two languages: Python, PySpark, Java, Scala, or Spark. Hands-on expertise with Azure (ADF, Databricks, Data Lake, Delta Lake, or Lakehouse) and Big Data technologies (HDFS/Hadoop, Cloudera, etc.). Advanced SQL skills and …
ll need: Demonstrated experience as a Senior Data Engineer in complex enterprise environments. Deep understanding of technology fundamentals and experience with languages like Python, or functional programming languages like Scala. Demonstrated experience in design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake. Commendable skills in building data products by integrating large …
London (City of London), South East England, United Kingdom
Sahaj Software
version control systems like Git, enabling effective collaboration and code management. Experience in an ML engineer or data scientist role building ML models. Experience writing code in Python, R, Scala, Java, or C++ with documentation for reproducibility. Experience using Apache Spark/Databricks distributed compute environments for AI/ML workloads. Experience handling petabyte-scale datasets, diving into data to discover …
practice. We are looking for experience in the following skills: Relevant work experience in data science, machine learning, and business analytics. Practical experience in coding languages, e.g. Python, R, Scala (Python preferred). Proficiency in database technologies, e.g. SQL, ETL, NoSQL, DW, and Big Data technologies, e.g. PySpark, Hive. Experience working with structured and unstructured data, e.g. …
platforms (Azure) and data engineering best practices. Advanced proficiency in Power BI, including DAX, Power Query, and data modeling. Strong programming skills in Python, SQL, and/or Scala for data processing and automation. Experience with ETL/ELT, data warehousing, and event-driven architectures. Knowledge of AI/ML applications in data analytics and business intelligence. Proven …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Searchability
as a Data Engineer, with Python & SQL expertise. Familiarity with AWS services (or equivalent cloud platforms). Experience with large-scale datasets and ETL pipeline development. Knowledge of Apache Spark (Scala or Python) beneficial. Understanding of agile development practices, CI/CD, and automated testing. Strong problem-solving and analytical skills. Positive team player with excellent communication abilities. TO BE CONSIDERED …
web/mobile applications or platforms with either Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. Exposure to polyglot programming languages like Scala, Python and Golang will be a plus. Ability to read/write code and expertise with various design patterns. Have used NoSQL databases such as MongoDB, Cassandra, etc. Work on …
platforms (e.g., Hadoop, Spark, Kafka) and multi-cloud environments (AWS, Azure, GCP). Experience with machine learning frameworks (e.g., TensorFlow, Scikit-learn, PyTorch). Strong programming skills in Python, Java, or Scala. Familiarity with data pipeline development and real-time data processing. Proven ability to design and implement data-intensive solutions at scale. Experience supporting analytics for the intelligence or defense community …
predictive analyses and machine learning models. Preferred Experience: Experience with cloud computing/storage, data lakes/warehouses. Experience with Spark and ML applications. Experience with Scala and Java. Experience with Robotic Process Automation (RPA), e.g. UiPath. Experience with Natural Language Processing (NLP). Experience in Databricks …
sets through the use of scripts or algorithms. Required Skill Sets: • Experience programming or scripting and debugging in one or more languages such as Python, JavaScript, R, SQL, Scala, etc. • Proficiency with data mining, mathematics, and statistical analysis, demonstrated with hands-on academic and project experience conducting statistical analysis, testing, and modeling using regression analysis, linear regression, predictive modeling …
of experience in DevOps/Data Engineering. Extensive familiarity with AWS services including CloudFormation, EC2, S3, and RDS. CloudFormation, Ansible, Git, Jenkins, Bash. Programming Languages: Python, Java or Scala. Processing Tools: Elasticsearch, Spark, NiFi, and/or Docker. Datastore Types: Graph, NoSQL, and/or Relational. US Citizenship and an active TS/SCI with Polygraph security clearance required …
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
. Proficiency in performance testing tools like JMeter, Gatling, K6, NeoLoad, or WebLOAD. Strong coding skills in at least one language such as Java, TypeScript/JavaScript, Python, C#, Scala, or PHP. Experience designing and building automation frameworks. Familiarity with Agile development environments (SCRUM, Kanban, TDD, BDD). Implementing pipelines using common tooling such as Jenkins, ADO, GitHub Actions …