problem-solving and communication skills. Proficiency in scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Solid understanding of software development best practices, including version control (Git), testing, and code review processes.
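The DAG model mentioned above can be sketched without Airflow itself: the core idea is simply tasks with dependencies that must run in topological order. A minimal illustration using Python's standard-library `graphlib` (task names here are invented, not from any listing):

```python
# Illustrative only: the DAG model that tools like Apache Airflow build on,
# sketched with the standard library. Task names are hypothetical.
from graphlib import TopologicalSorter

# Map each task to the set of tasks it depends on.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

# static_order() yields a valid execution order respecting all dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # e.g. ['extract', 'transform', 'validate', 'load']
```

Airflow expresses the same structure with operators and `>>` dependency syntax, but the scheduling guarantee is the same: no task runs before its upstream tasks complete.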
LIN buses; serial buses (RS485/RS232, etc.); SPI/I2C; Python; Go; XML; JSON; HTML; CSS; web backend servers (Angular, Django, NodeJS, React, Apache or similar); WebSockets; IP video and video routing; familiarity with systems serving real-time information via WebSockets; use of DDS and interfacing …
Azure Data Lake, Azure Databricks or GCP Cloud Dataproc. Familiarity with big data technologies and distributed computing frameworks, such as Hadoop, Spark, or Apache Flink. Experience scaling an “API Ecosystem”, designing and implementing “API-First” integration patterns. Experience working with authentication and authorisation protocols/patterns. Other Information …
structures. Experience of API (REST) development, Docker, and Kubernetes. Familiarity with IntelliJ, Subversion and Maven. Exposure to one or more of the following technologies: Apache Storm, OpenSearch, Cassandra and Kafka. Ability to work within a hybrid Agile methodology. Understand the design and development approaches required to build a scalable …
is data analysis; building an enterprise data platform within asset management or financial services is required. Expert-level Java/Python development skills. AWS expert: Apache Iceberg/Spark and Airflow. Please apply for immediate consideration.
offs explicit and understandable to others. REQUIREMENTS: 7+ years' coding experience, including 3 years in a dedicated ML Engineering role; 2+ years' experience with Apache Spark; experience working with GB+ scale data; experience with deployed ML services; experience deploying multiple ML projects across different environments; productionisation experience in at …
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create …
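The kind of CSV/JSON transformation described above can be sketched at toy scale with the standard library alone; Spark or Databricks apply the same shape of transformation distributed across a cluster. The file contents and field names here are invented for illustration:

```python
# A minimal, standard-library sketch of a CSV -> JSON record transformation,
# the toy-scale analogue of what Spark/Databricks do distributed at scale.
# Data and column names are hypothetical.
import csv
import io
import json

csv_text = "name,score\nada,90\ngrace,95\n"

records = []
for row in csv.DictReader(io.StringIO(csv_text)):
    row["score"] = int(row["score"])  # cast the numeric column from string
    records.append(row)

print(json.dumps(records))
# [{"name": "ada", "score": 90}, {"name": "grace", "score": 95}]
```

In PySpark the equivalent would be a `spark.read.csv(...)` followed by typed column casts and `df.write.json(...)`, but the transformation logic is the same.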
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes). Ability to create …
and team-working skills. Nice to haves: Degree in Computer Science or similar. Experience with NoSQL databases: Mongo, Cassandra, Redis. Real-time streaming: Apache Storm/Kafka Streams. Infrastructure knowledge: Ansible, Puppet, AWS, Kubernetes.
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed …
the Midlands. Ideal Candidate Profile: We are seeking an individual who has the following attributes: Proven expertise as a Data Engineer, demonstrating proficiency in Apache Spark and cloud-based technologies, particularly Microsoft Azure and Databricks. Strong programming skills, with a focus on Python, along with proficiency in ETL frameworks …
explain and present the findings of technical work to non-expert audiences. Fluency with Python machine learning and data science packages (pandas, scikit-learn, Apache Spark, Dask, TensorFlow, etc.), or experience with programming languages and willingness to learn Python. For engineering, experience in a DevOps role, ideally in a …
DBs. Assisting in the development of high-performing teams. Demonstrable problem-solving and ownership skills. Nice to have: understanding/knowledge/exposure to Apache, Tomcat, container tools, SSO technologies, and monitoring tools, but certainly not critical to the functionality of this position. The above is a wish list …
Monitoring, Tuning, Housekeeping. Experience working in an ITIL environment (desired). UNIX, Linux, AIX, Solaris (desired). Nice to have: understanding/knowledge/exposure to Apache, Tomcat, container tools, SSO technologies, and monitoring tools, but certainly not critical to the functionality of this position. The above is a wish list …
step functions and ECS services. Strong understanding of AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
Greater London, England, United Kingdom Hybrid / WFH Options
CommuniTech Recruitment Group
AWS ecosystems like Lambdas, Step Functions and ECS services. Experience with Dremio is a nice-to-have. Experience with data stack technologies, such as Apache Iceberg & Spark. Exposure to Apache Airflow, Prefect, Dagster, DBT. Expertise in data analysis with exposure to data services (such as Glue, Lake Formation …
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed …
Rickmansworth, Hertfordshire, South East, United Kingdom
Mobilize Financial Services
build, operate and manage a complex production environment. Familiarity with RedHat-based Linux versions. Experience of web application server architectures, security, protocols and technologies (Apache Web Server, HAProxy, Tomcat), including configuration and optimization. Understanding of DR/BCP business processes. Comfortable liaising with business users as well as technical teams …
developing and optimising ETL pipelines. Version Control: Experience with Git for code collaboration and change tracking. Data Pipeline Tools: Proficiency with tools such as Apache Airflow. Cloud Platforms: Familiarity with AWS, Azure, Snowflake, and GCP. Visualisation: Tableau or PowerBI. Delivery Tools: Familiarity with agile backlogs, code repositories, automated builds …
proficiency in SQL for data querying and transformation. ● Programming skills in Python, including experience with basic libraries like os, csv, and pandas. ● Experience with Apache Airflow for workflow management. ● Experience with enterprise DBMS (e.g., DB2, MS SQL Server) and cloud data warehouses, particularly Google BigQuery. ● Proficiency in Google Cloud …
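The SQL querying and transformation skill described above can be illustrated at small scale with Python's built-in `sqlite3` standing in for an enterprise DBMS or BigQuery; the table and column names here are invented:

```python
# A toy SQL transformation using the standard library's sqlite3 as a
# stand-in for an enterprise DBMS or a cloud warehouse like BigQuery.
# Schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 70.0)],
)

# A per-group aggregation: the bread and butter of warehouse transformations.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 150.0), ('south', 70.0)]
```

The same `GROUP BY` statement would run essentially unchanged on DB2, MS SQL Server, or BigQuery; only the connection layer differs.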
management and data governance open-source platform that we will teach you. Other technologies in use in our space: RESTful services, Maven/Gradle, Apache Spark, Big Data, HTML5, AngularJS/ReactJS, IntelliJ, GitLab, Jira. Cloud Technologies: You’ll be involved in building the next generation of finance systems …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday and …
Basingstoke, England, United Kingdom Hybrid / WFH Options
Intec Select
cross-functionally across the business to understand the requirements of the products. Designing and implementing performance-related data ingestion pipelines from multiple sources using Apache Spark. Integrating end-to-end data pipelines, ensuring a high level of quality is maintained. Working with an Agile delivery/DevOps methodology to …
in Computer Science, Software Engineering, or a related field. Proven experience as a Senior Software Developer, with a strong background in LAMP stack applications (Linux, Apache, MySQL, PHP). Proficiency in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks like React or Angular. Strong experience with database design and …