Durham, County Durham, North East, United Kingdom Hybrid / WFH Options
Reed Technology
a data science team and managing complex projects. Expertise in machine learning, statistics, data management, and relevant technologies (e.g., Python, R, SQL, AWS SageMaker, Apache Airflow, dbt, AWS Kinesis). Strong communication skills, with the ability to explain complex data concepts to a non-technical audience. Knowledge of industry …
up and learn new technologies and frameworks. Nice to have: Knowledge of databases, SQL. Familiarity with Boost ASIO. Familiarity with data serialization formats such as Apache Arrow/Parquet, Google Protocol Buffers, FlatBuffers. Experience with gRPC, HTTP/REST and WebSocket protocols. Experience with Google Cloud/AWS and …
Other skills we are looking for you to demonstrate include: Experience with data storage technologies: Delta Lake, Iceberg, Hudi. Sound knowledge and understanding of Apache Spark, Databricks or Hadoop. Ability to take business requirements and translate these into technical specifications. Knowledge of architecture best practices and patterns. Competence in …
integration architecture and design skills. Good communication skills. Desirable Skills: JavaScript – with React/Vue being even better. Docker/Kubernetes. Linux – basic sysadmin (Apache, Nginx). SQL/Oracle/PostgreSQL/MongoDB/DynamoDB. Message Queues – RabbitMQ or similar. AWS or GCP. This is an office-based position …
offs explicit and understandable to others. REQUIREMENTS: 7+ years' coding experience, including 3 years in a dedicated ML Engineering role. 2+ years' experience with Apache Spark. Experience working with GB+ scale data. Experience with deployed ML services. Experience deploying multiple ML projects across different environments. Productionisation experience in at …
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create …
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create …
Drupal, Magento, BigCommerce, Laravel. Proficient in setting up development and staging environments. Proficient in using and altering MySQL databases. Familiarity with server stacks, specifically Apache and Nginx. Familiarity with domain management and DNS records. Familiarity with Agile and Waterfall work environments. Nice to haves: Familiarity with digital marketing …
the Midlands. Ideal Candidate Profile: We are seeking an individual who has the following attributes: Proven expertise as a Data Engineer, demonstrating proficiency in Apache Spark and cloud-based technologies, particularly Microsoft Azure and Databricks. Strong programming skills, with a focus on Python, along with proficiency in ETL frameworks …
Leeds, England, United Kingdom Hybrid / WFH Options
Candour Solutions
employees. Driving change and handling difficult situations. Dealing with change on a daily basis. Ability to demonstrate the value of changes introduced. Web servers – Apache, Nginx, IIS, Drupal. Databases – Oracle, MySQL, MSSQL. Traffic management – F5, HAProxy, Keepalived. CDNs – CDNetworks, Akamai, Incapsula. Scripting – Bash, Bat, Python, Perl. Programming languages – PHP …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
MBN Solutions
performant data pipelines. Proven hands-on experience delivering and maintaining enterprise-grade data pipelines utilising technologies such as Azure Data Factory, Pentaho Data Integrator, Apache Hop, etc. Demonstrable understanding of modern ETL/ELT practices, frameworks, tooling and execution environments. Demonstrable understanding of data delivery approaches like batch, micro …
in programming languages commonly used in machine learning, preferably Python. Experience with machine learning frameworks and libraries such as TensorFlow, PyTorch, scikit-learn, or Apache Spark. Proven track record of developing and implementing machine learning solutions in a professional setting. Passion for exploring new technologies and driving innovation in …
build their AI practice and a team around you. Required Skills: Building cloud and native machine learning architecture with LlamaIndex, Hugging Face, SentenceTransformers, PyTorch, Python, Apache Spark. Experience with practical application of AI and scaling AI with these tools. Experience in healthcare is essential. We would love to share …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed …
objectives. So each team leverages the technology that fits their needs best. You’ll see us working with data processing/streaming frameworks like Apache Flink and Spark; database technologies like MySQL, PostgreSQL, DynamoDB and Redis; and breaking things using in-house chaos principles and tools such as Gatling … latency, near real-time products: Java- and Scala-based web services, Databricks Data Lakes (Delta Lakes), AWS Kinesis and MSK, AWS Elasticsearch, AWS RDS, Apache Flink & Spark, scripting using Python, and Terraform infrastructure as code. The interview process: Our interview aims to take a relaxed & practical approach that …
Elasticsearch, BigQuery, PostgreSQL. FullCircl – Lead Data Engineer (04.24) · Kubernetes, Docker, Airflow. KEY RESPONSIBILITIES · Designing and implementing scalable data pipelines using tools such as Apache Spark, Google Pub/Sub etc. · Optimizing data storage and retrieval systems for maximum performance using both relational and NoSQL databases. · Continuously monitoring and improving the … Data Infrastructure projects, as well as designing and building data-intensive applications and services. · Experience with data processing and distributed computing frameworks such as Apache Spark. · Expert knowledge in one or more of the following languages – Python, Scala, Java, Kotlin. · Deep knowledge of data modelling, data access, and data …
Reading, England, United Kingdom Hybrid / WFH Options
CACTUS IT SOLUTIONS
Exhibit proficiency in Java concurrency for parallel test execution, providing valuable expertise to enhance testing efficiency. • Java Libraries: Demonstrate advanced usage of libraries like Apache Commons, Guava, etc., showcasing the ability to select and implement libraries for optimized efficiency. • Automation Frameworks: Work with Selenium WebDriver, JUnit, TestNG, Cucumber BDD … advanced frameworks, contributing to their enhancement and customization to meet project requirements. • REST API Testing: Utilize advanced API testing tools like REST Assured and Apache HttpClient, showcasing in-depth knowledge of API testing. • Framework Implementation: Contribute to the enhancement and customization of automation frameworks, ensuring they align with …
required) Experience with distributed message brokers using Kafka (required). Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink etc. (required). Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm and CI …
EC1N, Farringdon Without, Greater London, United Kingdom
Damia Group Ltd
SC Cleared DevSecOps Engineer - London Hybrid - £50,000 - £65,000 base salary plus 15% cash flex (guaranteed income, which can be taken as cash or used to buy extra benefits), plus 10% bonus and core benefits including pension, life insurance, private medical and disability benefit. Fantastic opportunity to join a … skilled SC Cleared DevSecOps Engineer to join their team, where your key responsibilities will be to: Manage operational procedures. Transform and process data using Apache Spark. Administer AWS RDS with MySQL. Work with the Hadoop platform. Create reports using Tableau. Utilize Red Hat Decision Central. Skills/Experience Required … of the SC Cleared DevSecOps Engineer: Strong operational procedures knowledge. Proficient in Apache Spark, AWS RDS (MySQL), and Hadoop. Knowledge of Tableau and Red Hat Decision Central. If this SC Cleared DevSecOps Engineer role sounds a good fit for you, get in touch now for a prompt discussion on …
Employment Type: Permanent
Salary: £50000 - £65000/annum 15% cash flex and 10% bonus