frameworks and practices. Understanding of machine learning workflows and how to support them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks…
services (AWS, GCP, or Azure) Understanding of data modeling, distributed systems, ETL/ELT pipelines, and streaming architectures Proficiency in SQL and at least one programming language (e.g., Python, Scala, or Java) Demonstrated experience owning complex technical systems end-to-end, from design through production Excellent communication skills with the ability to explain technical concepts to both technical and non-technical…
business to deliver value-driven solutions What we're looking for: London/Lloyd's Market experience is essential Strong programming skills in Python and SQL; knowledge of Java or Scala is a plus Solid experience with relational databases and data modelling (Data Vault, Dimensional; see the sketch below) Proficiency with ETL tools and cloud platforms (AWS, Azure or GCP) Experience working in Agile and…
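Several listings on this page ask for dimensional modelling experience. As a minimal, hedged sketch of the dimensional side, the Scala/Spark snippet below aggregates a hypothetical insurance star schema: a narrow fact table of measures joined to a wide dimension table of descriptive attributes. Every table and column name here is invented for illustration.

```scala
import org.apache.spark.sql.SparkSession

object StarSchemaQuery {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("star-schema-demo").getOrCreate()

    // Fact table: one row per policy transaction, holding the measure
    // (premium_gbp) and surrogate keys into the dimensions.
    // Dimension table: descriptive context for each policy.
    val report = spark.sql(
      """SELECT d.underwriting_year,
        |       d.line_of_business,
        |       SUM(f.premium_gbp) AS total_premium
        |FROM fact_policy f
        |JOIN dim_policy d ON f.policy_key = d.policy_key
        |GROUP BY d.underwriting_year, d.line_of_business""".stripMargin)

    report.show()
  }
}
```

The design choice a star schema encodes is simple: measures live in the fact table, context lives in dimensions, and reports are aggregations of the former grouped by attributes of the latter.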
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar (see the streaming sketch below) Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, PySpark, Scala or Java. Experience operating in a cloud-native environment (e.g. Fabric, AWS, GCP, or Azure). Excellent stakeholder management and communication skills. A strategic mindset, with a practical approach to…
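Since this listing pairs Kafka with Spark and Databricks, here is a hedged sketch of the streaming-ingest pattern those tools are usually combined for: a Spark Structured Streaming job in Scala that reads a Kafka topic and appends to a Delta table. The broker address, topic, and paths are hypothetical, and a production job would add schema enforcement and monitoring.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object KafkaToDelta {
  def main(args: Array[String]): Unit = {
    // Requires the spark-sql-kafka connector on the classpath
    // (bundled on Databricks runtimes).
    val spark = SparkSession.builder().appName("kafka-to-delta").getOrCreate()

    // Subscribe to a Kafka topic (hypothetical broker and topic names).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "orders")
      .load()

    // Kafka delivers the payload as binary; cast it to a string and keep
    // the broker timestamp for downstream auditing.
    val parsed = events.select(
      col("value").cast("string").as("payload"),
      col("timestamp").as("ingested_at"))

    // Append continuously to a Delta table; the checkpoint directory lets
    // the stream resume exactly where it left off after a restart.
    parsed.writeStream
      .format("delta")
      .option("checkpointLocation", "/mnt/checkpoints/orders")
      .start("/mnt/delta/orders")
      .awaitTermination()
  }
}
```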
Solr is a strong advantage. Bachelor's or master's degree in computer science, Information Systems, Engineering, or a related field. Deep expertise in Databricks, including Spark (PySpark/Scala), Delta Lake, and orchestration within Databricks workflows. Strong understanding of cloud infrastructure and data services on at least one major cloud platform (Azure preferred, but AWS or GCP also accepted)…
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric, or a strong willingness to learn Experience using version control tools like Git and knowledge of CI/CD pipelines Familiarity with software testing methodologies and…
of AWS services including S3, Redshift, EMR, Kinesis and RDS. - Experience with Open Source Data Technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) - Ability to write code in Python, Ruby, Scala or another platform-related big data technology - Knowledge of professional software engineering practices & best practices for the full software development lifecycle, including coding standards, code reviews, source control management, build…
Extensive experience working with AWS with a strong understanding of Redshift, EMR, Athena, Aurora, DynamoDB, Kinesis, Lambda, S3, EC2, etc. Experience with coding languages like Python/Java/Scala Experience in maintaining data warehouse systems and working on large-scale data transformation using EMR, Hadoop, Hive, or other Big Data technologies Experience mentoring and managing other Data Engineers, ensuring…
web/mobile applications or platforms with either a Java/J2EE or .NET tech stack and database technologies such as Oracle, MySQL, etc. - Exposure to polyglot programming languages like Scala, Python and Golang will be a plus - Ability to read/write code and expertise with various design patterns - Have used NoSQL databases such as MongoDB, Cassandra, etc. - Work on…
incl. deep learning, GenAI, LLMs, etc., as well as hands-on experience with AWS services like SageMaker and Bedrock, and programming skills such as Python, R, SQL, Java, Julia, Scala, Spark/NumPy/pandas/scikit, JavaScript Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace…
and cloud platforms like AWS or Azure Exposure to SQL/NoSQL databases Leadership or mentoring experience Self-starter, flexible, and eager to learn Bonus: Exposure to C# or Scala Why Join? Be part of a highly collaborative, inclusive team culture Enjoy continuous learning - take up to 5 paid days per year for training, with access to Pluralsight, and support…
Experience in Cloud Data Pipelines Building cloud data pipelines involves using Azure-native tools such as Databricks with PySpark or Scala (a minimal sketch follows below). These pipelines are essential for tasks like sourcing, enriching, and maintaining structured and unstructured data sets for analysis and reporting. They are also crucial for secondary tasks such as flow pipelines, streamlining AI model performance, and enhancing interaction…
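As a concrete reading of "sourcing, enriching, and maintaining" data sets, here is a minimal batch sketch in Scala on Spark, of the kind that runs inside a Databricks workflow: raw JSON events are filtered, enriched against reference data, and written out as a Delta table. The storage paths, column names, and table name are all hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EnrichmentPipeline {
  def main(args: Array[String]): Unit = {
    // On Databricks a session already exists; getOrCreate reuses it.
    val spark = SparkSession.builder().appName("enrichment-pipeline").getOrCreate()

    // Source: semi-structured events landed in the lake (hypothetical path).
    val rawEvents = spark.read.json("abfss://landing@mylake.dfs.core.windows.net/events/")

    // Reference data used to enrich each event (hypothetical path).
    val customers = spark.read.parquet("abfss://curated@mylake.dfs.core.windows.net/customers/")

    // Enrich: drop events failing a basic quality check, attach customer
    // attributes, and stamp the processing time.
    val enriched = rawEvents
      .filter(col("event_id").isNotNull)
      .join(customers, Seq("customer_id"), "left")
      .withColumn("processed_at", current_timestamp())

    // Sink: a Delta table ready for analysis and reporting.
    enriched.write.format("delta").mode("append").saveAsTable("analytics.enriched_events")
  }
}
```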
infrastructure Strong verbal and written communication skills with the ability to work effectively across internal and external organisations and virtual teams. Would be a plus: Skills in Java or Scala BS/BA or equivalent degree in computer science or similar We offer: Opportunity to work on bleeding-edge projects Work with a highly motivated and dedicated team Benefits package…
grade Data/AI platforms. This includes working knowledge of ETL, data governance, streaming architectures, Open Table Formats, etc. Proficiency in at least one programming language (e.g., Python, Java, Scala) plus SQL. CI/CD experience and working knowledge of API development. Ability to lead technical teams and facilitate discussions regarding technical/architectural trade-offs, best practices and…
Senior Data Engineer Location: London, UK (3 days in the office) SC Cleared: Required Essential Skills & Experience: 10+ years of experience in data engineering, with at least 3 years of hands-on experience with Azure Databricks. Strong proficiency in Python…
and operational efficiency. Requirements: Bachelor's degree in computer science or equivalent practical experience. 3+ years' experience building and deploying cloud workloads. 3+ years' experience in Java, Golang, or Scala, building tools and refactoring code. Broad knowledge of enterprise security technologies in a high-growth, cloud-based environment. Experience with AWS/GCP, containerised environments, and security architecture alignment. Knowledge…
South East London, London, United Kingdom Hybrid / WFH Options
TEN10 SOLUTIONS LIMITED
and fostering a culture of continuous learning and delivery excellence. Bonus Points For: Previous consultancy or professional services experience. Hands-on experience with additional programming languages like Python, Ruby, Scala, or PHP. Exposure to cloud platforms and infrastructure-as-code within test environments. What's in It for You? We reward leadership and initiative with a package that reflects your impact…
contributions to the success of both the team and the organization. Nice to Have: Experience with Docker and Kubernetes. Familiarity with Grafana and other monitoring tools. Prior experience with Scala and Java is an advantage. What we offer: You will have the chance to be involved in something impactful, large-scale, and meaningful. Most importantly, you will be part of…
industry best practices Demonstrated experience developing teams, encouraging growth, serving as a technical mentor and leader Shows strength and comprehension in at least one programming language (e.g., Java, Python, Scala, Kotlin) Experience making large directional technical decisions (e.g., deciding which technology or pattern to create or leverage) Experience being "on-call" for a service, and familiarity with incident notification tooling…
Job ID: Amazon (China) Holding Company Limited The AOP (Analytics Operations and Programs) team is responsible for creating core analytics, insight generation and science capabilities for ROW Ops. We develop scalable analytics applications, AI/ML products and research models…
Data Engineer Sr - Informatica ETL Expert Locations: Two PNC Plaza (PA374), Birmingham - Brock (AL112), Dallas Innovation Center - Luna Rd (TX270), Strongsville Technology Center (OH537) Full time…
also keeps sight of the bigger picture. Minimum Qualifications Strong understanding of foundational computer science algorithms and data structures Experience building API services with a JVM-based language - Java, Scala or Kotlin (see the sketch below) Conceptual understanding of SQL with a view to becoming an expert BS degree in Computer Science or meaningful relevant work experience Preferred Qualifications Experience with large-scale data…
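To make "API services with a JVM-based language" concrete, here is a deliberately tiny Scala sketch on the JDK's built-in HTTP server, exposing one hypothetical health endpoint. A production service would normally reach for a framework (http4s, Play, Spring Boot), but the shape - route, handler, response - is the same.

```scala
import com.sun.net.httpserver.{HttpExchange, HttpServer}
import java.net.InetSocketAddress

object HealthApi {
  def main(args: Array[String]): Unit = {
    val server = HttpServer.create(new InetSocketAddress(8080), 0)

    // Any request to /health gets a static JSON body with a 200 status.
    server.createContext("/health", (exchange: HttpExchange) => {
      val body = """{"status":"ok"}""".getBytes("UTF-8")
      exchange.getResponseHeaders.add("Content-Type", "application/json")
      exchange.sendResponseHeaders(200, body.length.toLong)
      exchange.getResponseBody.write(body)
      exchange.close()
    })

    server.start()
    println("Listening on http://localhost:8080/health")
  }
}
```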
complex data landscapes. Agile/Scrum delivery experience. Ideally, involvement in a modern data platform rollout (Fabric/Azure). Strong technical skills in Azure Synapse, Python, Java, or Scala, with deep knowledge of ETL and data architecture. Apply now to lead a skilled team, modernise data capabilities and shape our client’s digital future.
talent and fostering a collaborative work environment Strong ability to engage with and influence senior stakeholders, understanding their needs and translating them into actionable technical plans Desirable skills in Scala and Spark to support wider programmes of work Hands-on public cloud experience with AWS, Azure, or Google Cloud Platform, both their services and how to work in…
at least 7 years' experience building and shipping highly available, fault-tolerant, production-ready distributed backend systems You have experience in any JVM-based language (such as Java, Kotlin, Scala) and are confident in your ability to build, debug and ship microservices You are customer-focused and continuously suggest how the backend can provide the best customer experience You pride…