Reading, Berkshire, United Kingdom Hybrid / WFH Options
Syntax Consultancy Ltd
WebDriver, JUnit, TestNG, Cucumber BDD and advanced frameworks, contributing to their enhancement and customization to meet project requirements. A deep understanding of Java libraries: Apache Commons, Guava etc., with the ability to select and implement libraries for optimized efficiency. Technical Environment: REST API Testing, utilizing advanced API testing tools … like RestAssured and Apache HttpClient, with in-depth knowledge of API testing. Apply common design patterns in test automation, with an advanced understanding of exception handling and error management in Java. Best Practices: Advanced Java coding standards and practices, ensuring superior code quality. Establish and enforce advanced Java best practices more »
problem-solving and communication skills. Proficiency in scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Solid understanding of software development best practices, including version control (Git), testing, and code review processes. more »
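The DAG model referenced in that listing can be sketched without Airflow itself: Python's standard-library graphlib resolves the same kind of task-dependency graph that Airflow schedules. The task names below are hypothetical — a minimal sketch under that assumption, not real pipeline code.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies --
# the same shape an Airflow DAG expresses with operators and ">>".
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# A valid execution order respects every dependency edge:
# "extract" must come first, "load" last.
order = list(TopologicalSorter(deps).static_order())
```

`TopologicalSorter` raises `CycleError` if the graph contains a cycle, which is exactly the constraint the "acyclic" in DAG enforces on a workflow.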
Leatherhead, Surrey, South East, United Kingdom Hybrid / WFH Options
RINA
experiencing a real breadth and variety of project work on some of the most technically advanced platforms in UK Defence, including unmanned air systems, Apache, Wildcat, Chinook and Typhoon, to name a few. The successful candidate will support and deliver Systems Engineering and Assurance projects for technically interlinked assets and more »
Telford, Shropshire, West Midlands, United Kingdom
RECRUIT123 LIMITED
Caching (Redis) It's an advantage if you also have experience with: JavaScript frameworks such as Vue/React or similar Linux server management Apache or Nginx DNS, SSL & Domain management What the role involves: Being part of a development team that is responsible for all aspects of ongoing more »
of financial markets UI skills a plus (JavaScript/Angular/React etc.) Graph database experience Big data technology stack such as Spark, HDFS, Apache Flink etc. Data domain modelling Working knowledge of UNIX/Linux ABOUT GOLDMAN SACHS At Goldman Sachs, we commit our people, capital and ideas more »
Experience building and running monitoring infrastructure at a large scale, for example Elasticsearch clusters, Prometheus, Kibana, Grafana, etc. Web applications and HTTP servers: Java, Apache, NGINX. Load balancers: ELB, HAProxy, NGINX. Experience in running SQL/NoSQL data stores: RDS, DynamoDB, ElastiCache, Solr. Perks of Working at Viator Competitive more »
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion At Databricks more »
ML libraries (TensorFlow, PyTorch, scikit-learn, transformers, XGBoost, ResNet), geospatial libraries (shapely, geopandas, rasterio), CV libraries (scikit-image, OpenCV, YOLO, Detectron2). AWS, Postgres, Apache Airflow, Apache Kafka, Apache Spark. Mandatory requirements: You have at least 5 years of experience in a DS role, deploying models into production more »
tasks, both for oneself and others, ensuring progress aligns with project goals. Nice to have: Previous experience in startup environments. Proficiency or experience with Apache Spark. Familiarity or background in working with Azure. Experience orchestrating workflows, particularly within distributed system environments. Knowledge of MLOps principles and practices, especially in more »
SQL Data Warehouse, Azure Data Lake, Azure Databricks Azure Cosmos DB, Azure Data Factory, Azure Search, Azure Stream Analytics Delta Lake and Data Lakes Apache Spark Pools, SQL Pools (dpools and spools) Experience in Python, C# coding, Spark, PySpark, and Unix shell/Perl scripting. Experience in API data more »
notebooks, Spark, and Docker Professional experience of designing, building and managing bespoke MLOps in cloud environments, using tools such as SageMaker Processing, SageMaker Pipelines, Apache Airflow, and MLflow. Strong, fundamental technical expertise in cloud-native technologies, such as serverless functions, API gateway, container registry, Cloud Formation/CDK. Experience more »
operating and securing a DevOps and DevSecOps environment. PCs (Mac, Windows, Linux) Mobile devices (Android, iOS, Windows) Servers (Linux, Windows) Web servers (IIS, Apache, NGINX) Databases (SQL Server, Oracle, MongoDB, Postgres, Redis) Data warehouses (Redshift, Snowflake) Network devices (Firewalls, Proxy, NIPS, others) 8+ years of experience in critical more »
as R, Python, Azure, Machine Learning (ML), and Databricks. Essential criteria and experience Proficiency in one or more analytical tools, e.g. R, Python, Tableau, Apache Spark, etc. Proficiency in Azure Machine Learning and Azure Databricks. Proactivity and a self-starting attitude. Excellent analytical and problem-solving ability. Interest more »
is not essential, but it would make things easier. The main technologies we use include: Elixir, Erlang, Python, Terraform, Ansible, Packer, EMR/Spark, Apache Druid, BigQuery and Redis. Familiarity with cloud technologies, ideally AWS and technologies such as EC2, ECS, EMR, AWS Lambda, DynamoDB, S3 more »
machine learning or more general statistical analysis Strong software development skills with proficiency in Python or C++ Experience with analytics frameworks such as Pandas, Apache Spark, Dask, or Flink Experience with machine learning frameworks such as TensorFlow, JAX, PyTorch, Spark MLlib, Keras, or scikit-learn Experience in cloud-based … infrastructures such as AWS or GCP Exposure to orchestration platforms such as Apache Airflow or Kubeflow Proven attention to detail, critical thinking, and the ability to work independently within a cross-functional team What benefits do we offer? Condé Nast Learning Hub where you'll find more »
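The analytics frameworks named in that listing (Pandas, Spark, Dask) all share the split-apply-combine pattern, which is why fluency in one transfers to the others. A minimal Pandas sketch, with made-up data and illustrative column names:

```python
import pandas as pd

# Hypothetical measurements; column names are illustrative only.
df = pd.DataFrame({
    "category": ["a", "a", "b", "b", "b"],
    "value": [10, 20, 5, 5, 5],
})

# Split-apply-combine: group rows by category, then aggregate each group.
totals = df.groupby("category")["value"].sum()
```

In PySpark the equivalent is `df.groupBy("category").sum("value")`, and Dask mirrors the Pandas call almost verbatim.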
development (ideally AWS) Knowledge and ideally hands-on experience with data streaming, event-based architectures and Kafka Strong communication and interpersonal skills Experience with Apache Spark or Apache Flink would be ideal, but not essential. Please note, this role is unable to provide sponsorship. If this role sounds more »