offs explicit and understandable to others
REQUIREMENTS
- 7+ years' coding experience, including 3 years in a dedicated ML Engineering role
- 2+ years' experience with Apache Spark
- Experience working with GB+ scale data
- Experience with deployed ML services
- Experience deploying multiple ML projects across different environments
- Productionisation experience in at …
Flask, Tornado or Django, Docker
- Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo
- Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc.
- Data acquisition and development of data sets and improving data quality
- Preparing data for predictive and prescriptive modelling
- Hands on …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
Work with Hadoop, Spark, and other platforms for large-scale data processing.
- Real-Time Data Streaming: Develop and manage pipelines using CDC, Kafka, and Apache Spark.
- Database Management: Handle SQL databases like Oracle, MySQL, or PostgreSQL.
- Data Governance: Ensure data quality, security, and compliance with best practices.
Ideal Candidate …
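For illustration only, a minimal PySpark Structured Streaming sketch of the kind of Kafka-to-Spark streaming pipeline described above; the broker address, topic and event schema are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the classpath.

    # Sketch: consume change events from Kafka and aggregate them with Spark
    # Structured Streaming. Broker, topic and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("cdc-stream-sketch").getOrCreate()

    event_schema = StructType([
        StructField("order_id", StringType()),
        StructField("status", StringType()),
        StructField("amount", DoubleType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
           .option("subscribe", "orders_cdc")                  # hypothetical CDC topic
           .load())

    events = (raw
              .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
              .select("e.*"))

    # Running count of events per status, printed to the console sink.
    query = (events.groupBy("status").count()
             .writeStream
             .outputMode("complete")
             .format("console")
             .start())
    query.awaitTermination()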
experience in data engineering. Experienced in building ETL data pipelines. Relational database experience w/PostgreSQL. Understanding of tech within our stack: AWS/Apache Beam/Kafka. Experience with Object-Oriented Programming. A desire to work in the commodities/trading sector. Permanent/Full-Time Employment. Hybrid …
the following:
- Experience of working in an Agile product delivery framework
- Experience with PHP 8+ and the Laravel framework
- Experience with Linux, NGINX (or Apache), MySQL server. LEMP/LAMP stack.
- Experience of writing unit tests with test frameworks (PHPUnit, Codeception, etc.)
Although we have a dedicated QA Test …
Linux environments. Knowledge of data modeling, database design, and query optimization techniques. Experience with real-time data processing, streaming, and analytics technologies (e.g., Kafka, Apache Flink). Familiarity with financial markets, trading systems, and quantitative analysis is a plus. Excellent problem-solving, analytical, and communication skills, with the ability …
testing, and maintenance of data pipelines and data storage systems on Google Cloud Platform (GCP). You will be working with technologies such as Apache Airflow, BigQuery, Python, and SQL to transform and load large data sets, ensuring high data quality and accessibility for business intelligence and analytics purposes.
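To illustrate the Airflow-plus-BigQuery side of such a role, a minimal DAG sketch that runs a daily BigQuery transformation; the project, dataset and table names are hypothetical placeholders, not part of the advert.

    # Sketch of an Airflow DAG that runs a daily BigQuery transformation.
    # Dataset and table names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from google.cloud import bigquery


    def load_daily_orders():
        # Rebuild a small reporting table from raw events (hypothetical SQL).
        client = bigquery.Client()
        sql = """
            CREATE OR REPLACE TABLE analytics.daily_orders AS
            SELECT DATE(event_ts) AS order_date, COUNT(*) AS orders
            FROM raw.events
            GROUP BY order_date
        """
        client.query(sql).result()  # block until the load job finishes


    with DAG(
        dag_id="daily_orders_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="load_daily_orders",
            python_callable=load_daily_orders,
        )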
and IAM. Experience with containerization and orchestration tools, particularly Kubernetes. Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or CloudFormation. Experience in Apache Airflow, AWS Backup & S3 versioning. Solid understanding of CI/CD concepts and experience implementing CI/CD pipelines using tools like Jenkins, GitLab …
Databricks
• Must have hands-on experience on at least 2 hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent).
• In-depth knowledge of key technologies like BigQuery/Redshift/Synapse/Pub Sub/Kinesis/…
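As a sketch of the kind of Big Data processing service mentioned here, a minimal Apache Beam (Python SDK) word-count pipeline; the file paths are hypothetical, and the same code can target Dataflow or another runner through its pipeline options.

    # Sketch: a tiny Apache Beam batch pipeline (word count).
    # Paths are hypothetical; the runner defaults to the local DirectRunner.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("input.txt")
            | "Split" >> beam.FlatMap(lambda line: line.split())
            | "Pair" >> beam.Map(lambda word: (word, 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda word, n: f"{word}\t{n}")
            | "Write" >> beam.io.WriteToText("word_counts")
        )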
Skills:
- 3+ years Drupal
- Database technologies (i.e. SQL)
- Understanding of PHP and multiple frameworks
- Knowledge of front-end JavaScript frameworks
- Experience with Linux and Apache
Benefits:
- 31 days holiday + bank holidays
- Life assurance policy
- Up to 16% pension contribution
- Gym membership discount
- Ability to purchase additional annual leave …
Agile software development and system architecture within the Telco OSS domain, with preferred experience in Network GIS (Hexagon, IQ Geo) and workflow tooling (Appian, Apache Airflow). Strong understanding of platform and product dynamics, including Platform Engineering and its relevance to OSS. Extensive background in DevOps practices, encompassing test …
teams to support the orchestration of our ETL pipelines using Airflow and manage our tech stack including Python, Next.js, Airflow, PostgreSQL, MongoDB, Kafka and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and quickly resolving production issues. Contribute to …
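Purely as an illustration of the Kafka-to-PostgreSQL part of this stack, a minimal consumer sketch using kafka-python and psycopg2; the topic, table and connection details are hypothetical.

    # Sketch: consume JSON events from Kafka and upsert them into PostgreSQL.
    # Topic, table and connection details are hypothetical placeholders.
    import json

    import psycopg2
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "events",                                  # hypothetical topic
        bootstrap_servers="broker:9092",           # hypothetical broker
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    conn = psycopg2.connect("dbname=analytics user=etl")  # hypothetical DSN

    with conn, conn.cursor() as cur:
        for message in consumer:
            event = message.value
            cur.execute(
                "INSERT INTO events (id, payload) VALUES (%s, %s) "
                "ON CONFLICT (id) DO UPDATE SET payload = EXCLUDED.payload",
                (event["id"], json.dumps(event)),
            )
            conn.commit()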
engineers of varying levels of experience. Flexibility and willingness to adapt to new software and techniques.
Nice to Have
- Experience working with projects in Apache Spark, Databricks or similar.
- Expert cloud platform knowledge, e.g. Azure.
What will be your key responsibilities? A technical expert and leader on the Petcare …
master and metadata management. Experience with Azure SQL Database, Azure Data Factory, Azure Storage, and Azure IaaS/PaaS-related database implementations. Experience with Apache Spark and the new Fabric framework would be a plus.
4. Monitoring and Logging: Implement and maintain monitoring, logging, and alerting solutions.
Key technologies:
- AWS, VPN, VPC Peering, EC2, S3, Lambda, Aurora, Docker/Kubernetes
- Apache Airflow, AWS networking concepts such as VPN, VPC peering, subnets, security groups, NAT gateways
- AWS CloudWatch or equivalent
- Kafka or similar data streaming components
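As a small sketch of the monitoring and alerting piece, publishing a custom metric and an alarm to AWS CloudWatch via boto3; the namespace, metric and alarm names are hypothetical.

    # Sketch: push a custom metric and create an alarm on it in CloudWatch.
    # Namespace, metric and alarm names are hypothetical placeholders.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

    # Publish one data point, e.g. from a pipeline health check.
    cloudwatch.put_metric_data(
        Namespace="DataPipelines",
        MetricData=[{"MetricName": "FailedTasks", "Value": 0, "Unit": "Count"}],
    )

    # Alarm when more than 5 task failures occur within a 5-minute window.
    cloudwatch.put_metric_alarm(
        AlarmName="pipeline-failed-tasks",
        Namespace="DataPipelines",
        MetricName="FailedTasks",
        Statistic="Sum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=5,
        ComparisonOperator="GreaterThanThreshold",
    )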
Recent and proven experience of using Red Hat Linux (or other Unix flavours), including scripting in a commercial environment
- Experience supporting applications (Java, .NET, Apache, IIS)
Desirable:
- Knowledge of Microsoft Windows Server
- Experience of working within the financial industry
- Experience of working within an ITIL framework
- Experience of working with …
as R, Python, Azure, Machine Learning (ML), and Databricks.
Essential criteria and experience
- Proficiency in one or more analytical tools, e.g. R, Python, Tableau, Apache Spark, etc.
- Proficiency in Azure Machine Learning and Azure Databricks.
- Pro-activity and a self-starting attitude.
- Excellent analytical and problem-solving ability.
- Interest …
of ticketing systems
- Knowledge of New Relic, Dynatrace, Datadog, or equivalent
- Knowledge of HTML, CSS, JS, PHP, Java, MySQL
- Experience maintaining web and application servers (Apache, NGINX, PHP, Redis, Tomcat, Varnish, etc.)
- Enthusiastic and results-driven
- Punctual and organised
- Good problem solver
Why BORN? We are an award-winning global …
There will be a particular emphasis in this role on developing information systems within a Microsoft SQL Server development environment and/or an Apache Spark big data processing environment, creating algorithms and pipelines to ingest and transform data into information systems and solutions capable of answering clinical and …
understanding of networking and IP packet structure
- Experience working in a DevOps team designing, developing and supporting solutions
- Experience in website development using Apache and PHP
The successful applicant will work within the network monitoring and intrusion detection & prevention team. Your role will involve working closely with the …
and Saltstack
CI/CD: Jenkins, GitLab CI/CD
Data/Messaging: Amazon Aurora (Postgres), ElastiCache (Redis), AmazonMQ (RabbitMQ)
API: Tyk API Gateway, Apache
Monitoring/Logging: Datadog and SumoLogic
Security: IAM (Identity Access Management), Security Groups, mTLS
Other: VPC and general networking
In return, they would be …
Hackney, Greater London, Shoreditch, United Kingdom
Talent Smart
role. Proven experience with Snowflake data warehouse, including data loading, transformations, and performance tuning. Strong expertise in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi, etc.). Experience with data visualization tools, particularly Power BI. Excellent problem-solving and analytical skills. Strong communication skills, with the ability to …
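By way of illustration, a minimal sketch of a Snowflake load step using the Python connector; the account, warehouse, stage and table names are hypothetical placeholders.

    # Sketch: load staged CSV files into a Snowflake table with COPY INTO.
    # All connection details, stage and table names are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # hypothetical
        user="etl_user",           # hypothetical
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    try:
        cur = conn.cursor()
        cur.execute("""
            COPY INTO staging.orders
            FROM @orders_stage
            FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        """)
        print(cur.fetchall())  # one row of load results per staged file
    finally:
        conn.close()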