Experience with ETL/ELT tools, APIs, and integration platforms. Deep knowledge of data modelling, warehousing, and real-time analytics. Familiarity with big data technology principles (e.g., Spark, Hadoop) and BI tools (e.g., Power BI, Tableau). Strong programming skills (e.g., SQL, Python, Java, or similar languages). Ability to exercise a substantial degree of independent professional responsibility.
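To ground the data modelling, warehousing, and SQL skills this listing asks for, here is a minimal sketch of a star-schema model and an analytical query over it. It uses Python's built-in sqlite3 module purely so the example is self-contained; the table names, columns, and sample rows are hypothetical and not taken from the listing.

```python
# Minimal star-schema sketch: one dimension table, one fact table, one
# analytical query. sqlite3 is used only to keep the example self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        region      TEXT
    )
""")

# Fact table: one row per sale, keyed to the dimension.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        sale_date   TEXT,
        amount      REAL
    )
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd', 'West Midlands')")
cur.execute("INSERT INTO fact_sales VALUES (1, 1, '2024-01-15', 1250.0)")

# A typical warehouse-style query: revenue by region.
for row in cur.execute("""
    SELECT d.region, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_customer d USING (customer_id)
    GROUP BY d.region
"""):
    print(row)
```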
West Bromwich, England, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
Job Description: The Opportunity: The Leonardo Cyber & Security Division, one of the three divisions in Leonardo UK, is a pivotal innovator, helping customers deliver and secure their digital transformation. The Cyber & Security Division is at the forefront of supplying technology …
Experience with SQL, NoSQL, and data visualization tools. Strong analytical and problem-solving skills. Experience with social media analytics and user behavior analysis. Knowledge of big data technologies like Hadoop, Spark, Kafka. Familiarity with AWS machine learning services such as SageMaker and Comprehend. Understanding of data governance and security in AWS. Excellent communication and teamwork skills. Attention to detail.
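As a rough illustration of the AWS machine learning services named above, the sketch below scores two made-up social media posts with Amazon Comprehend via boto3. It assumes AWS credentials are already configured; the region and example texts are placeholders.

```python
# Hedged sketch: sentiment scoring of social media text with AWS Comprehend.
# Assumes AWS credentials are configured; region and posts are placeholders.
import boto3

comprehend = boto3.client("comprehend", region_name="eu-west-2")

posts = [
    "Loving the new release, setup took five minutes!",
    "Support has not replied in three days, very frustrating.",
]

for post in posts:
    result = comprehend.detect_sentiment(Text=post, LanguageCode="en")
    # detect_sentiment returns an overall label plus per-class scores.
    print(post, "->", result["Sentiment"], result["SentimentScore"])
```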
Job summary: We are seeking 3 Data Engineers to join our defence & security client on a contract basis. Key skills required for this role: DV cleared, Data Engineer, ETL, Elastic Stack, Apache NiFi. Important: DV Cleared - Data Engineer - ELK & NiFi.
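For context on the Elastic Stack side of this role, below is a minimal, hypothetical sketch of the load step of an ETL flow: indexing one transformed record into Elasticsearch with the official Python client (8.x-style API). In practice ingestion might be orchestrated with Apache NiFi instead; the endpoint, index name, and record fields are assumptions.

```python
# Minimal "load" step of an ETL flow into Elasticsearch (8.x-style client API).
# Endpoint, index name, document id, and record fields are placeholders.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

record = {
    "source": "sensor-feed",
    "value": 42.7,
    "ingested_at": datetime.now(timezone.utc).isoformat(),
}

# Index (create or overwrite by id) a single transformed record.
response = es.index(index="demo-ingest", id="sensor-feed-0001", document=record)
print(response["result"])  # e.g. "created" or "updated"
```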
e.g., Kubernetes). Preferred Skills: Experience with feature stores (e.g., Feast, Tecton). Knowledge of distributed training (e.g., Horovod, distributed PyTorch). Familiarity with big data tools (e.g., Spark, Hadoop, Beam). Understanding of NLP, computer vision, or time series analysis techniques. Knowledge of experiment tracking tools (e.g., MLflow, Weights & Biases). Experience with model explainability techniques (e.g., SHAP).
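To illustrate the experiment tracking mentioned in the preferred skills, here is a small hedged sketch using MLflow's logging API. The run name, parameters, and metric values are invented; by default MLflow writes runs to a local ./mlruns directory.

```python
# Hedged sketch of experiment tracking with MLflow's logging API.
# Run name, hyperparameters, and metric values are invented for illustration.
import mlflow

with mlflow.start_run(run_name="demo-baseline"):
    # Log the hyperparameters of a hypothetical training run...
    mlflow.log_param("learning_rate", 0.001)
    mlflow.log_param("batch_size", 64)

    # ...and the validation metric it produced, step by step.
    for epoch, val_loss in enumerate([0.92, 0.61, 0.48]):
        mlflow.log_metric("val_loss", val_loss, step=epoch)
```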
Telford, Shropshire, West Midlands, United Kingdom Hybrid / WFH Options
JLA Resourcing Ltd
solving and communication skills, including the ability to convey complex concepts to non-technical stakeholders. Desirable (but not essential): Experience with SSIS, AWS or Azure Data Factory. Familiarity with Hadoop, Jenkins, or DevOps practices including CI/CD. Cloud certifications (Azure or AWS). Knowledge of additional programming languages or ETL tools. This is a fantastic opportunity to take …
West Bromwich, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
both on-premise and cloud-based data systems. Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools to …
visualization tools such as Tableau, Power BI, or similar to effectively present validation results and insights. Nice-to-Have Requirements: Familiarity with big data tools and technologies, such as Hadoop, Kafka, and Spark. Familiarity with data governance frameworks and validation standards in the energy sector. Knowledge of distributed computing environments and model deployment at scale. Strong communication skills, with …
Informatica, or Talend) and scripting (e.g., Python, Shell). Understanding of database security best practices and regulatory requirements. Experience with NoSQL or big data technologies (e.g., MongoDB, Cassandra, Hadoop). Qualifications: Familiarity with DevOps pipelines and Infrastructure as Code (e.g., Terraform, Liquibase). Certifications such as Microsoft Certified: Azure Database Administrator Associate, Oracle Database SQL Certified Associate, or AWS Certified Database – Specialty. Exposure to BI/reporting tools like Power BI, Tableau, or Looker. Experience with NoSQL or big data technologies (e.g., MongoDB, Cassandra, Hadoop). Active DoD 8570-compliant certification (e.g., Security+, CASP+, CISSP). Experience with cloud platforms such as AWS GovCloud, Azure Government. Familiarity with configuration management tools (e.g., Ansible, Puppet). Experience in …
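As a minimal sketch of working with one of the NoSQL stores listed above, the snippet below inserts and reads back a document in MongoDB via pymongo. The connection string, database, collection, and document contents are placeholders.

```python
# Minimal MongoDB access sketch with pymongo. Connection string, database,
# collection, and document contents are all placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["demo_db"]

# Insert a document and read it back by a field value.
db.accounts.insert_one({"account_id": "A-1001", "status": "active"})
doc = db.accounts.find_one({"account_id": "A-1001"})
print(doc)

# A unique index supports the lookup above and is routine DBA hygiene.
db.accounts.create_index("account_id", unique=True)
```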
data points per day and create highly available data processing and REST services to distribute data to different consumers across PWM. Technologies used include the following. Data Technologies: Kafka, Spark, Hadoop, Presto, Alloy (a data management and data governance platform). Programming Languages: Java, Scala, Scripting. Database Technologies: MongoDB, ElasticSearch, Cassandra, MemSQL, Sybase IQ/ASE. Micro Service Technologies: REST, Spring … tech stacks. SKILLS AND EXPERIENCE WE ARE LOOKING FOR: Computer Science, Mathematics, Engineering or other related degree at bachelor's level; Java, Scala, Scripting, REST, Spring Boot, Jersey; Kafka, Spark, Hadoop, MongoDB, ElasticSearch, MemSQL, Sybase IQ/ASE; 3+ years of hands-on experience with relevant technologies. ABOUT GOLDMAN SACHS: At Goldman Sachs, we commit our people, capital and ideas …
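The listing's stack is Java/Scala with Spring Boot; purely to keep the examples on this page in one language, the sketch below shows the shape of such a pipeline in Python with kafka-python: consume events from Kafka and maintain an in-memory view that a REST layer could serve. The broker address, topic name, and message schema are assumptions.

```python
# Illustrative only: the listing uses Java/Scala + Spring Boot. This sketch
# consumes JSON events from Kafka and keeps a view a REST service could
# expose. Broker, topic, and message fields are assumptions.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "positions",                              # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

latest_by_account = {}  # in-memory view a REST endpoint could serve

for message in consumer:
    event = message.value
    latest_by_account[event["account_id"]] = event
    print(f"updated {event['account_id']} -> {event}")
```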
platforms (AWS, Azure, GCP) and deploying models. Ability to use data visualization tools like Tableau or Power BI. Nice-to-Have Skills: Knowledge of big data tools such as Hadoop, Kafka, Spark. Understanding of data governance and validation standards in energy. Experience with distributed computing and large-scale deployment. Strong communication skills for explaining complex validation results. Join GE …
with cloud platforms (AWS, Azure, GCP) and deploying models. Ability to use data visualization tools like Tableau or Power BI. Nice-to-Have: Familiarity with big data tools like Hadoop, Kafka, Spark. Knowledge of data governance and validation standards in energy. Experience with distributed computing and large-scale deployment. Strong communication skills for explaining complex validation results. At GE …
Telford, England, United Kingdom Hybrid / WFH Options
hackajob
development environment. Experience estimating task effort and identifying dependencies. Excellent communication skills. Familiarity with Python and its numerical, data and machine learning libraries. Favourable If You Have: experience of Hadoop and Jenkins; Azure Certified; AWS Certified; familiarity with Java. This position is a full-time, permanent role and applicants must have (or be able to acquire) SC clearance. Ad …
automated testing and deployments using CI/CD pipelines. Continual learning through internal and external training. What you'll bring. Mandatory: Proficient in at least two of the following: Hadoop, GCP, or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of …
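This role asks for Scala or Java; for consistency with the other sketches here, the example below uses PySpark to show the shape of a typical batch aggregation a Big Data engineer might write: read event files, count events per type per day, and write the result as Parquet. The paths, schema, and column names are hypothetical.

```python
# Illustrative batch aggregation in PySpark (the role itself names Scala/Java).
# Input path, schema, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

# Read partitioned JSON events (could equally be s3://... or gs://... paths).
events = spark.read.json("/tmp/events/*.json")

# Count events per type per day and write the result back out as Parquet.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .count()
)

daily_counts.write.mode("overwrite").parquet("/tmp/daily_event_counts")

spark.stop()
```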
to line manage a small team of junior data engineers. Continual learning through internal and external training. What you'll bring. Mandatory: Proficient in at least two of the following: Hadoop, GCP, or AWS for creating Big Data solutions. Skilled in using technologies such as Scala or Java. Hands-on experience with Continuous Integration and Deployment strategies. Solid understanding of …
building finance systems on cloud platforms like AWS S3, Snowflake, etc. Preferred Qualifications: Interest in or knowledge of investment banking or financial instruments. Experience with big data concepts, such as Hadoop for a Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). ABOUT GOLDMAN SACHS: At Goldman Sachs, we dedicate our people …
Experience with cloud technologies such as AWS S3, Snowflake, etc. Preferred Qualifications: Interest in or knowledge of investment banking or financial instruments. Experience with big data concepts and tools like Hadoop for a Data Lake. Experience with near real-time transactional systems like Kafka. Experience in Business Process Management (BPM). About Goldman Sachs: At Goldman Sachs, we dedicate our people …