Java Software Engineer (Developer Programmer Java Python Automation Data Lake Datalake Data Mesh CI/CD Kafka BigData AWS GCP SQL Finance Trading Contract Contractor Consultant London Financial Services Banking Remote Working AWS Trading Cloud Projects Dremio Dagster Iceberg Kotlin DBT) required by our financial … equivalent AWS or GCP Strong database knowledge The following is DESIRABLE, not essential: Finance Python Role: Java Software Engineer (Developer Programmer Java Python Automation Data Lake Datalake Data Mesh CI/CD Kafka BigData AWS GCP SQL Finance Trading Contract Contractor Consultant London Financial Services … Working AWS Trading Cloud Projects Dremio Dagster Iceberg Kotlin DBT) required by our financial services client in Dublin, Ireland. You will join a central data engineering team of 8 that are working on a 2-3 project to migrate their AWS-based data lake to a data …
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
About the Role: I am looking for a talented Data Engineer to join our client's dynamic team on a 6-month contract basis. This is an exciting opportunity for a mid-level professional with 3-5 years of experience to make a significant impact on our data infrastructure … remote/hybrid working options and a competitive day rate of £250-£400, falling inside IR35 regulations. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using AWS, Databricks, Python, Spark, and SQL. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and troubleshoot data workflows to ensure efficiency and reliability. Implement data quality and validation checks to ensure data integrity. Contribute to the development and maintenance of our data lake and data warehouse solutions on AWS.
Vulnerability and Compliance Engineer who will assist the SEA team with improving and maintaining the Bank's vulnerability and compliance tooling. Transforming and presenting data within the Bank's big data platform to the Governance and Assurance Team (and others). Developing custom compliance checks where they … script in both Python and PowerShell, understands modern development practices such as version control and CI/CD pipelines, and has a background in data flows (eg ETL). They will be passionate about security vulnerability management and compliance scanning, and passionate about turning data into actionable processes … standards. Implement custom logic to prioritise vulnerabilities and compliance issues. Integrate tool output (eg from a compliance tool or system) into the Bank's big data platform, used for compliance. Implement queries (SQL or similar) to extract relevant data for reporting and alerting. Perform application patching of …
Milton Keynes, Buckinghamshire, South East, United Kingdom
Maclean Moore Ltd
Kafka Developer Duration: 6 Months Location: Milton Keynes Key responsibilities: Demonstrable experience as a Kafka Developer (ideally Kafka Streams). Hands-on experience in big data technologies (Hadoop, Hue, Hive, Impala, Spark, etc.), with particular strength in Kafka. Knowledge and experience using key-value databases. Experience … developing microservices using Spring. Design and develop cloud-based data engineering solutions: Java/Scala, OpenShift/Kubernetes, deployments in Jenkins. The candidate will be part of the team responsible for the progress and support of one of the main in-house streaming products, and will also be responsible for creating scalable, configurable streaming applications that provide fresh data as part of data services for different applications, usually 24/7 applications with demanding performance requirements. Key skills/knowledge/experience: Big data: Hadoop - Hive and Spark/Scala, solid …
Data engineer - Python - Azure - Inside IR35 - £500 per day - 6 months Exalto Consulting are currently recruiting for a contract Data Engineer for a client, 100% remote working inside IR35 paying £500 per day, initially 6 months. Skills required for the role: Big data experience Data inventory and data familiarisation Efficient data ingestion and ingestion pipelines Data cleaning and transformation Databricks (ideally with Unity Catalog) Python and PySpark CI/CD (ideally with Azure DevOps) Unit testing (PyTest) If you have the above experience and are looking for a new contract role … please send your CV for immediate consideration as our client is looking to hire ASAP. Data engineer - Python - Azure - Inside IR35 - £500 per day - 6 months
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Experis
digital and platforms. Role purpose/summary Collaborate with stakeholders to understand business requirements and translate them into effective solution designs for our party data model data platform. Architect and implement cloud-based solutions that leverage popular data platforms to enable efficient storage, processing, and analysis of … large-scale data. Design scalable and resilient architectures that ensure high availability, fault tolerance, and disaster recovery for our data platform. Assess and select appropriate data platforms and technologies based on functional and non-functional requirements, considering factors such as performance, scalability, security, and cost-effectiveness. Provide guidance … of the solution design. Key Skills/requirements Proven experience as a Solution Architect, designing and implementing cloud-based solutions. Strong knowledge of popular data platforms such as Amazon Web Services (AWS) (e.g., S3, RDS, Redshift, Athena), Microsoft Azure (e.g., Blob Storage, SQL Database, Data Lake Analytics), or …
Our client is looking for a ServiceNow Engineer who will be accountable for enriching, managing, and ensuring the quality of organizational and product usage data, with a focus on efficient query writing. They will enforce data governance policies and best practices, as well as develop documentation on data … Inside IR35 6 Months + Extension Skills/Experience Recent commercial experience with developing on the ServiceNow platform Experience of Glide Good background in big data analytics, database management, or an equivalent analyst position Proficiency in PL/SQL and an additional object-oriented programming language (Highly desirable … Experience in big data instances such as Cloudera, Azure, Snowflake, etc. Structured thinking with the ability to break down ambiguous problems and propose impactful data modeling designs Ability to roll out analytics dashboards (e.g., Power BI, Tableau Server) Ability to translate business needs into technical requirements Additional …
and troubleshooting of the SOC's technology stack (hardware and software). The engineer will also assist with the continued development and maintenance of data pipelines and signature updates and the professional development of the system engineering team. Tasks: Perform system administration on specific cyber defence applications and systems … defence network tools in response to new or observed threats within the network environment or enclave. Manage the compilation, cataloguing, distribution, and retrieval of data from a range of enterprise networks and data sources. Implement data management standards, requirements, and specifications. Develop data standards, policies, and … procedures. Analyse data sources to provide actionable recommendations and facilitate data-gathering methods. To share knowledge, skills and experience, create and improve documentation, and train new members of the data engineering team. Knowledge: Knowledge of big data technologies and ecosystems (eg, NiFi). Knowledge of …
Computer Futures - London & S.E(Permanent and Contract)
and troubleshooting of the SOC's technology stack (hardware and software). The engineer will also assist with the continued development and maintenance of data pipelines and signature updates and the professional development of the system engineering team. Tasks: * Perform system administration on specific cyber defence applications and systems … defence network tools in response to new or observed threats within the network environment or enclave. * Manage the compilation, cataloguing, distribution, and retrieval of data from a range of enterprise networks and data sources. * Implement data management standards, requirements, and specifications. * Develop data standards, policies, and … procedures. * Analyse data sources to provide actionable recommendations and facilitate data-gathering methods. * To share knowledge, skills and experience, create and improve documentation, and train new members of the data engineering team. Knowledge: * Knowledge of big data technologies and ecosystems (eg, NiFi). * Knowledge of …
out and implemented by a Sales-focused Cloud Architect with an ability to secure UK eDV security clearance. With Machine Learning techniques, Encryption and Big Data tools, this firm have solved a critical privacy issue that will benefit National Security and Intelligence clients across the world and is … equally impactful for other sectors, allowing private access to Neural Networks and Data sets for knowledge growth. If you are interested in delivering impactful product presentations and able to highlight mission-critical benefits of a software product, you will be highly valued in this role. You'll need to … and consultative selling of B2B software Demonstrable experience working with Privacy technologies such as homomorphic encryption, SMPC, Federated Learning, TEE. Any prior experience with Data management or tooling hugely beneficial (Data Lakes, Snowflake or Databricks). Experience working closely with engineering teams (particularly those working on Machine Learning …
and troubleshooting of the SOC's technology stack (hardware and software). The engineer will also assist with the continued development and maintenance of data pipelines and signature updates and the professional development of the system engineering team. Responsibilities: Perform system administration on specific cyber defence applications and systems … defence network tools in response to new or observed threats within the network environment or enclave. Manage the compilation, cataloguing, distribution, and retrieval of data from a range of enterprise networks and data sources. Implement data management standards, requirements, and specifications. Develop data standards, policies, and … procedures. Analyse data sources to provide actionable recommendations and facilitate data-gathering methods. To share knowledge, skills, and experience, create and improve documentation, and train new members of the data engineering team. Key Skills: Previous experience of Enterprise ICS/network architectures and technologies. Working with frameworks …
settings comply with company policies and standards Developing custom logic to prioritize vulnerabilities and compliance issues Integrating compliance tool output into the company's big data platform for compliance purposes Executing queries to extract data for reporting and alerting Performing application patching for the company's vulnerability … and compliance applications Your experience/knowledge: Proficient in Python with experience in PowerShell, integration work, and processing structured data such as JSON and CSV for data importing and exporting Strong background in cybersecurity, with a focus on vulnerability and compliance management Experienced with data pipelines, workflows …/CD pipelines Ability to understand user requirements and implement them effectively within an existing tech stack Knowledge of integration and processing of structured data for seamless data platform operations Familiarity with tools such as Dataiku DSS, Tableau, and the Azure security ecosystem, especially Defender suite, is a …
the client - this will include transfer of knowledge and mentoring of less experienced members of the team. You will have a solid record in Big Data Engineering, with experience in delivering complex solutions for Enterprise level Clients. As well as maintaining robust and scalable applications in Scala, you … will be able to implement ETL pipelines to process, transform, and standardize data from various sources as well as optimise the performance of Spark applications. Work closely with data scientists, software engineers, and machine learning experts to enhance the data platform and contribute to the development of … ideally in Google Cloud Platform, but experience with other cloud environments is acceptable). Required Skills: Strong proficiency in Scala and Spark. Experience with data visualization tools. Familiarity with cloud platforms (GCP preferred, but AWS or Azure also acceptable). Knowledge of Kubernetes, Kafka, and other relevant technologies. Previous …
As a Data Architect, you'll lead the development of Java and Python projects, design API integrations using Spark, and collaborate with clients and internal teams to translate business requirements into high and low-level designs. You'll also define architecture and technical designs, create data flows and … Python project development. Design and develop API integrations using Spark. Collaborate with clients and teams to understand requirements. Define architecture and technical designs. Design data flows and integrations using Hadoop. Implement testing and develop comprehensive documentation. Provide training and support to end-users and client teams. Stay updated …
Chertsey, Surrey, South East, United Kingdom Hybrid / WFH Options
Polar Recruitment Services Ltd
best known and admired brands in the world, working within a continually growing Marketing team. You will be responsible for creating quality software and data structures that meet the functional and non-functional project requirements in the implementation, enhancement, and support of marketing projects. This will include producing application … general industry & platform-specific best practices. Skills & experience: Experience designing, implementing, and supporting enterprise-grade technical solutions in the cloud for meeting complex business data requirements. Working knowledge and experience with Big Data platforms, Adobe Experience Platform or similar CDP. Third normal form, star schema, Snowflake, etc. … Experience with data modelling, table design, and mapping business needs to data structures. In-depth knowledge in one or more programming languages (e.g. C++, Java, Python, R, PHP) Up to date understanding of best practices regarding system security measures. Hours of work: Flexible working pattern within a 37.5 …
We are partnered with a reputable global consultancy that are recruiting AWS DATA ENGINEERS to work on a very exciting LONG-TERM CONTRACT within the financial service sector. Role: AWS Data Engineer Rate: Up to £425 per day (inside IR35) Location: London Hybrid Duration: 6 months (initially view … responsibilities: Build code as per the given requirement. Ensure to develop unit test cases. Help in backlog grooming. Key skills: Extensive experience in developing big data pipelines in the cloud using big data technologies such as Apache Spark Expertise in performing complex data transformation using Spark SQL queries Experience in orchestrating data pipelines using Apache Airflow Proficiency in Git-based version control tools Proficiency with Linux commands and Bash Scripting Working experience in AWS big data services (EMR, Glue, Data Pipelines, Athena, S3, Step Functions etc.) & AWS CLI Experience with CI/CD tools such as Jenkins Experience working with relational …
Hive SQL Queries/Spark SQL Statements. * Experience in working with large databases - multi terabytes (3+ Terabytes). * Minimum of 5 years' experience in the Big Data space (Hive, Impala, Spark SQL, HDFS etc). * Any cloud experience (AWS/Azure/Google/Oracle). * Solid experience with …
My client is a digital, cloud, big data and security consultancy and will be a global leader in data-driven, trusted and sustainable digital transformation. As a next generation digital business with worldwide leading positions in digital, cloud, data, advanced computing and security, it brings deep expertise …