the following requirements: Strong data technology skills, with the ability to build, operate, maintain and support cloud infrastructure & data services at scale Extensive experience with Big Data technologies (Databricks, Spark, etc.) Cloud Infrastructure & Platform Engineering experience (Azure preferred) The ability to conduct cybersecurity industry research and rapidly develop data-driven prototypes for live business problems faced by Cybersecurity teams More ❯
there's nothing we can't achieve in the cloud. BASIC QUALIFICATIONS 5+ years of experience in cloud architecture and implementation 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience 5+ years of experience delivering cloud projects or cloud-based solutions Ability to communicate effectively in English, in technical and business settings Bachelor's degree in More ❯
there's nothing we can't achieve in the cloud. BASIC QUALIFICATIONS - 5+ years of experience in cloud architecture and implementation - 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 5+ years of experience delivering cloud projects or cloud-based solutions - Ability to communicate effectively in English, within technical and business settings - Bachelor's degree in Business, Computer More ❯
and listed on the London Stock Exchange. With 3,000 employees and 32 offices in 12 countries we're a business with lots of opportunity for people with talent, spark and lots of ambition. If you want to build a great career with a company that prioritises strong values - such as integrity and courage - where our people always pull More ❯
Proficiency in a systems programming language (e.g., Go, C++, Java, Rust). Experience with deep learning frameworks like PyTorch or TensorFlow. Experience with large-scale data processing engines like Spark and Dataproc. Familiarity with data pipeline tools like dbt. Benefits Flexible Working Hours & Remote-First Environment - Work when and where you're most productive, with flexibility and support. Comprehensive More ❯
preferred. Strong grasp of MLOps/LLMOps principles, including CI/CD for ML, model monitoring, and governance frameworks. Proficiency with large-scale data processing and storage technologies (SQL, Spark, Hadoop) is a plus. Excellent stakeholder management and communication skills, with proven ability to translate complex AI concepts for diverse audiences. Connect to your business - Technology and Transformation Distinctive More ❯
data engineering practices Oversee the integration and processing of financial market data feeds (transactions, trading, asset management, FX) Work with teams to design and maintain distributed data pipelines using Spark and Aurora PostgreSQL Act as a trusted advisor to business and technology stakeholders, translating business requirements into technical solutions Drive adoption of best practices in testing, CI/CD … financial services data platform projects Strong expertise in Python for data-intensive systems Deep understanding of financial market data (transaction feeds, asset management systems, trading platforms, FX) Experience with Apache Spark for distributed data processing Hands-on knowledge of Aurora PostgreSQL (or equivalent relational databases) Extensive experience with AWS cloud services for data platform builds Strong stakeholder engagement More ❯
through data science projects Awareness of data security best practices Experience in agile environments You would benefit from having: Understanding of data storage and processing design choices Familiarity with Apache Spark or Airflow Experience with parallel computing Candidates should be able to reliably commute or plan to relocate to Coventry before starting work. The role requires a Data More ❯
logic SQL (PostgreSQL/SQL Server) for data querying and pipelines React (TypeScript) for intuitive, modern UIs Exposure to cloud platforms (AWS/Azure), Docker, or streaming tools (Kafka, Spark, etc.) is a plus Ideal Profile: 1-2 years' experience in a commercial software or data engineering role Strong coding skills in Python, SQL, and React Keen to work More ❯
here to help you develop into a better-rounded professional. Basic Qualifications - 7+ years of technical specialist, design and architecture experience - 5+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 7+ years of consulting, design and implementation of serverless distributed solutions experience - 5+ years of software development with object oriented language experience - 3+ years of cloud More ❯
and at home, there's nothing we can't achieve. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional level certification More ❯
some of the brightest technical minds in the industry today. BASIC QUALIFICATIONS - 10+ years of technical specialist, design and architecture experience - 10+ years of database (e.g., SQL, NoSQL, Hadoop, Spark, Kafka, Kinesis) experience - 10+ years of consulting, design and implementation of serverless distributed solutions experience - Australian citizen with ability to obtain security clearance. PREFERRED QUALIFICATIONS - AWS Professional level certification More ❯
Reading, Berkshire; Cambridge, Cambridgeshire; Milton Keynes, Buckinghamshire; Guildford, Surrey, United Kingdom Hybrid / WFH Options
Deloitte LLP
solutions from structured and unstructured data. Build data pipelines, models, and AI applications, using cloud platforms and frameworks such as Azure AI/ML Studio, AWS Bedrock, GCP Vertex, Spark, TensorFlow, PyTorch, etc. Build and deploy production-grade fine-tuned LLMs and complex RAG architectures. Create and manage complex and robust prompts across the GenAI solutions. Communicate effectively More ❯
Data Engineer Sr - Informatica ETL Expert Locations: Two PNC Plaza (PA374), Birmingham - Brock (AL112), Dallas Innovation Center - Luna Rd (TX270), Strongsville Technology Center (OH537) Full time More ❯
and user needs Qualifications 5+ years of hands-on experience with Python, Java and/or C++ Development of distributed systems Kubernetes (K8s) AWS (SQS, DynamoDB, EC2, S3, Lambda) Apache Spark Performance testing Bonus Search system development (indexing/runtime/crawling) MLOps development and/or operations The cash compensation range for this role is More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Zachary Daniels
About You: 8+ years in data leadership roles, with proven success building and scaling modern data teams. Expert in Azure Synapse, Databricks, and Power BI, with strong SQL, Python, Spark, and data modelling skills. Deep knowledge of cloud-native data architecture, data lakes, and streaming pipelines. Strong grasp of predictive analytics, AI, and machine learning applications. Excellent stakeholder management More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Zachary Daniels
About You: · 8+ years in data leadership roles, with proven success building and scaling modern data teams. · Expert in Azure Synapse, Databricks, and Power BI, with strong SQL, Python, Spark, and data modelling skills. · Deep knowledge of cloud-native data architecture, data lakes, and streaming pipelines. · Strong grasp of predictive analytics, AI, and machine learning applications. · Excellent stakeholder management More ❯
current cyber security threats, actors and their techniques Experience with data science, big data analytics technology stack, analytic development for endpoint and network security, and streaming technologies (e.g., Kafka, Spark Streaming, and Kinesis) Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your More ❯
the latest tech, serious brain power, and deep knowledge of just about every industry. We believe a mix of data, analytics, automation, and responsible AI can do almost anything-spark digital metamorphoses, widen the range of what humans can do, and breathe life into smart products and services. Want to join our crew of sharp analytical minds? You'll More ❯
education None Preferred education Bachelor's Degree Required technical and professional expertise Design, develop, and maintain Java-based applications for processing and analyzing large datasets, utilizing frameworks such as Apache Hadoop, Spark, and Kafka. Collaborate with cross-functional teams to define, design, and ship data-intensive features and services. Optimize existing data processing pipelines for efficiency, scalability, and … degree in Computer Science, Information Technology, or a related field, or equivalent experience. Experience in Big Data Java development. In-depth knowledge of Big Data frameworks, such as Hadoop, Spark, and Kafka, with a strong emphasis on Java development. Proficiency in data modeling, ETL processes, and data warehousing concepts. Experience with data processing languages like Scala, Python, or SQL. More ❯
we expect from you Master's degree/PhD in Computer Science, Machine Learning, Applied Statistics, Physics, Engineering or related field Strong mathematical and statistical skills Experience with Python, Spark and SQL Experience implementing and validating a range of machine learning and optimization techniques Effective scientific communication for varied audiences Autonomy and ownership of projects Good understanding of software More ❯