MS Azure is especially recommended, as Microsoft Fabric is integrated within Azure services. Experience of designing robust, secure and compliant capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs. Experienced working in a DevOps/Agile team. Experience …
of ticketing systems. Knowledge of New Relic, Dynatrace, Datadog, or equivalent. Knowledge of HTML, CSS, JS, PHP, Java, MySQL. Experience maintaining web and application servers (Apache, NGINX, PHP, Redis, Tomcat, Varnish, etc.). Enthusiastic and results-driven. Punctual and organised. Good problem solver. Why BORN? We are an award-winning global …
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript, etc., using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on cloud platforms …
understanding of networking and IP packet structure. Experience working in a DevOps team designing, developing and supporting solutions. Experience in website development using Apache and PHP. The successful applicant will work within the network monitoring and intrusion detection & prevention team. Your role will involve working closely with the …
Scala, Kotlin, Spark, Google Pub/Sub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages: Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
Manchester Area, United Kingdom Hybrid / WFH Options
The Green Recruitment Company
Support colleagues in relation to the delivery of ESG built environment solutions. Exhibit thorough expertise in IES-VE, including modules like VE Compliance, Radiance, ApacheHVAC, Apache Systems, MacroFlo, MicroFlo, and Vista-Pro, and the ability to extract sustainability outputs (e.g., for BREEAM, LEED) from IES-VE. Your …
Position: DevSecOps Engineer Location: London - Hybrid Salary: Circa £65,000 plus 15% cash flex benefit and company bonus Security Clearance: Must have active UK Security Clearance to apply for this position About the Role: Our prestigious global client is seeking a highly skilled DevSecOps Engineer to join their dynamic team. … analysis. Your expertise will be instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management: Implement … and the ability to work collaboratively in a global team environment. Benefits: Competitive Salary: Circa £65,000 per annum. Flexible Benefits Package: 15% cash flex benefit. Performance Bonus: Eligibility for company bonus based on performance. Join a leading global company and take the next step in your career as a …
Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices. Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools. Excellent communication and collaboration skills, with the ability to … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team …
Modelling. Experience with at least one of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical path approaches, how to iterate to build value, engaging with stakeholders actively at all stages. Able to deal with …
Flask, Tornado or Django, Docker. Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo. Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality. Preparing data for predictive and prescriptive modelling. Hands-on …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing. Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark. Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL. Data Governance: Ensure data quality, security, and compliance with best practices. Ideal Candidate …
Linux environments. Knowledge of data modeling, database design, and query optimization techniques. Experience with real-time data processing, streaming, and analytics technologies (e.g., Kafka, Apache Flink). Familiarity with financial markets, trading systems, and quantitative analysis is a plus. Excellent problem-solving, analytical, and communication skills, with the ability …
• Troubleshooting network issues (tcpdump/Wireshark). • Scripting capabilities (SH/Bash/Python/Perl). • Configuration of common services (DNS/Apache/NGINX/Postfix/Squid/SSH/iptables). • Understanding of clustering services, enabling High Availability failover. • Experience with enterprise hardware and …
Data Engineer to join our team and build robust data pipelines. Responsibilities: Build and maintain scalable data pipelines. Maintain the pipeline through Databricks and Apache Beam. Seamless data integration and quality. Data storage solutions using GCP (will still be considered with other cloud experience). Best practices in data management …
within the EU Fusion programme and connections to international HPC communities, showcasing contributions made to the field. Experience in workflow management systems such as Apache Airflow. Familiarity with Research Data Management methodologies, modern database technologies including SQL, NoSQL and graph databases, and parallel file access technologies such as MPI …
and Saltstack. CI/CD: Jenkins, GitLab CI/CD. Data/Messaging: Amazon Aurora (Postgres), ElastiCache (Redis), Amazon MQ (RabbitMQ). API: Tyk API Gateway, Apache. Monitoring/Logging: Datadog and SumoLogic. Security: IAM (Identity Access Management), Security Groups, mTLS. Other: VPC and general networking. In return, they would be …
pipelines. Know your way around Unix-based operating systems. Experience working with any major cloud provider (AWS, GCP, Azure). Fluency in English. Experience using Apache Airflow. Experience using Docker. Experience using Apache Spark. Benefits: Salary £40-50K per annum dependent on skills and experience. 25 days annual …
GoLang. - Significant experience with Hadoop, Spark and other distributed processing platforms and frameworks. - Experience working with open table/storage formats like Delta Lake, Apache Iceberg or Apache Hudi. - Experience of developing and managing real-time data streaming pipelines using Change Data Capture (CDC), Kafka and Apache …
data components such as Azure Data Factory, Azure SQL DB, Azure Data Lake, etc. Strong Python and SQL skills for data manipulation. Experience with Apache Spark and/or Databricks. Familiarity with BI visualization tools like Power BI. Experience in managing end-to-end analytics pipelines (batch and streaming … such as Azure Data Engineer Associate are desirable. Knowledge of data ingestion methods for real-time and batch processing. Proficiency in PySpark and debugging Apache Spark workloads. What’s in it for you? Annual bonus scheme – up to 10%. Excellent pension scheme. Flexible working. Enhanced family-friendly policies. Ongoing …
Role: PHP Developer 🗺️ Location: Dudley - 2 days a week in office 💰 Salary: £35,000 - £45,000 ✅ Skills: PHP, Azure, Apache, CentOS, T-SQL. My client is expanding and looking for a PHP Developer to join their growing team in the Dudley area. You'll be responsible for the development … features. Troubleshooting web applications. Maintaining clear documentation of all work. Qualifications: 3+ years of experience with hands-on PHP programming. Experience with IDE, Azure, Apache, CentOS, T-SQL. What's in it for you: Starting salary up to £45,000. Continuous training and development. Flexible working. Please note: the …
Senior QA Automation Tester - Cypress Job Title: QA Automation Engineer Location: South West London Salary: £60,000 per year + Benefits Job type: Permanent Role: Hybrid A long established Fintech company based in London, renowned for its cutting-edge payment …
My client is a leading global technology consulting and digital solutions company who specializes in sectors such as banking, insurance, manufacturing, and healthcare. They leverage advanced technologies like cloud computing, AI, and data analytics to deliver scalable, cutting-edge solutions.