Nottingham, Nottinghamshire, East Midlands, United Kingdom
Microlise
data practices. Possess strong knowledge of data tools, data management tools, and various data and information technologies, e.g. DAMA DMBOK, Microsoft SQL Server, Couchbase, Apache Druid, Spark, Kafka, Airflow, etc. In-depth understanding of modern data principles, methodologies, and tools. Excellent communication and collaboration skills, with the ability to … native computing concepts and experience working with hybrid or private cloud platforms is a plus. Demonstrable technical experience working with a Microsoft, Red Hat, and Apache data and software engineering environment. A team-oriented individual with a passion for engineered excellence and the ability to lead and motivate a team.
Rickmansworth, Hertfordshire, South East, United Kingdom
Mobilize Financial Services
build, operate and manage a complex production environment. Familiarity with Red Hat-based Linux versions. Experience of web application server architectures, security, protocols and technologies (Apache Web Server, HAProxy, Tomcat), including configuration and optimisation. Understanding of DR/BCP business processes. Comfortable liaising with business users as well as technical teams.
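The web tier described (Apache in front of Tomcat) is typically wired up as a reverse proxy. A minimal, hypothetical Apache httpd virtual-host sketch; the hostname and ports are assumptions for illustration, not taken from the listing, and mod_proxy/mod_proxy_http must be enabled:

```apache
<VirtualHost *:80>
    ServerName app.example.com

    # Forward application traffic to a Tomcat instance on its default HTTP port
    ProxyPreserveHost On
    ProxyPass        /app http://127.0.0.1:8080/app
    ProxyPassReverse /app http://127.0.0.1:8080/app
</VirtualHost>
```

In a production setup of this kind, HAProxy would usually sit in front of several such Apache/Tomcat pairs for load balancing.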
have a valid visa, as we are not able to sponsor. Technical stack: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow. Skills: … years of experience in Python scripting; experience developing applications in Python; exposure to Python-oriented algorithm libraries such as NumPy, pandas …
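As an illustration of the kind of Python scripting such roles involve, here is a small, self-contained sketch that computes per-group summary statistics using only the standard library (in practice NumPy or pandas would do this vectorised; the data and column names are invented):

```python
import csv
import io
from statistics import mean

# Hypothetical CSV payload standing in for a real data extract
raw = """sensor,reading
a,10
a,14
b,7
"""

# Group readings by sensor and compute a per-sensor mean
readings = {}
for row in csv.DictReader(io.StringIO(raw)):
    readings.setdefault(row["sensor"], []).append(float(row["reading"]))

averages = {sensor: mean(values) for sensor, values in readings.items()}
print(averages)  # {'a': 12.0, 'b': 7.0}
```

With pandas the same result is a one-liner (`df.groupby("sensor")["reading"].mean()`), which is why those libraries are listed alongside plain Python scripting.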
Knutsford, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and the ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best practices.
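Orchestration tools like Airflow express a pipeline as a DAG of dependent tasks and run them in dependency order. The scheduling idea can be sketched with the standard library alone; this is not the Airflow API, and the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# A tiny ETL dependency graph: each task lists the tasks it depends on,
# mirroring how an Airflow DAG wires extract >> transform >> load.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

# TopologicalSorter yields an execution order that respects every dependency
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'load', 'notify']
```

Airflow adds scheduling, retries, and backfills on top of exactly this dependency-ordering core.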
Northamptonshire, England, United Kingdom Hybrid / WFH Options
Capgemini
Glue, Starburst, Snowflake, Redshift, hybrid (on-prem to cloud) for building data pipelines and ETL processes. Experience with data pipeline orchestration tools like Apache Airflow. Knowledge of SQL and the ability to write complex queries for data transformation and analysis. Understanding of data modelling principles and best practices.
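"Complex queries for data transformation" typically means aggregation plus window functions. A minimal sketch driven from Python against SQLite; the table, columns, and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# Aggregate per region, then rank regions by total revenue with a window function
rows = conn.execute(
    """
    SELECT region,
           SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS revenue_rank
    FROM orders
    GROUP BY region
    ORDER BY total DESC
    """
).fetchall()
print(rows)  # [('north', 150.0, 1), ('south', 75.0, 2)]
```

The same shape of query (GROUP BY feeding a window function) carries over to Snowflake, Redshift, and Athena with minor dialect differences.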
Greater London, England, United Kingdom Hybrid / WFH Options
Validis
Proven ability to leverage CI/CD tools to streamline data pipeline development and deployment. Experience designing and implementing ETL pipelines using tools like Apache Airflow, Luigi, Spark, or similar frameworks (familiarity is a plus). Understanding of data warehousing concepts and data modelling techniques. Experience with SQL and …
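An ETL pipeline of the kind described reduces to three stages: extract records from a source, transform them into clean typed rows, and load them into a warehouse. A deliberately small sketch in pure Python, with SQLite standing in for the warehouse; Airflow or Luigi would orchestrate functions like these rather than replace them:

```python
import sqlite3

def extract():
    # Stand-in for pulling rows from a source system or API
    return [{"name": " Alice ", "score": "10"}, {"name": "Bob", "score": "7"}]

def transform(rows):
    # Clean and type-convert each record
    return [(r["name"].strip(), int(r["score"])) for r in rows]

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS scores (name TEXT, score INTEGER)")
    conn.executemany("INSERT INTO scores VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
total = conn.execute("SELECT SUM(score) FROM scores").fetchone()[0]
print(total)  # 17
```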
with impressive visualization (Power BI) · Experience in building large scale DW/BI systems for B2B SaaS companies · Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR and RDS · Experience with AI/Machine Learning and Predictive Analytics · Experience in developing global products …
Tech: - AWS (S3, Glue, EMR, Athena, Lambda) - Snowflake, Redshift - DBT (Data Build Tool) - Programming: Python, Scala, Spark, PySpark or Ab Initio - Data pipeline orchestration (Apache Airflow) - Knowledge of SQL. This is a 6-month initial contract with a trusted client of ours. CVs are being presented on Friday and …
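dbt, listed in that stack, treats a transformation as a templated SELECT that the tool materialises as a table or view in the warehouse. A minimal, hypothetical model file (the model and column names are invented for illustration):

```sql
-- models/daily_revenue.sql (hypothetical dbt model)
-- {{ ref('stg_orders') }} resolves to the upstream model's relation at compile time
SELECT
    order_date,
    SUM(amount) AS daily_revenue
FROM {{ ref('stg_orders') }}
GROUP BY order_date
```

`dbt run` compiles the template, resolves dependencies between models, and executes the resulting SQL against Snowflake or Redshift.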
in Computer Science, Software Engineering, or a related field. Proven experience as a Senior Software Developer, with a strong background in LAMP stack applications (Linux, Apache, MySQL, PHP). Proficiency in front-end technologies such as HTML, CSS, JavaScript, and modern frameworks like React or Angular. Strong experience with database design and …
with Git for version control and project management, alongside some knowledge of Linux/Shell. Data platform familiarity - previous experience of working with both Apache Spark and MapReduce data processing and analytics frameworks. And reporting expertise - experience with Tableau, Power BI, Excel alongside notebooks for experiment documentation. What experience …
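The MapReduce model mentioned above splits work into a map phase (emit key/value pairs) and a reduce phase (combine all values per key), with a shuffle grouping them in between. A single-process sketch of the classic word count; Spark's RDD API expresses the same shape as `flatMap`/`reduceByKey`:

```python
from collections import defaultdict

documents = ["big data big plans", "data pipelines"]

# Map phase: emit (word, 1) for every word in every document
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group emitted values by key
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: combine each key's values into a single result
counts = {word: sum(values) for word, values in groups.items()}
print(counts)  # {'big': 2, 'data': 2, 'plans': 1, 'pipelines': 1}
```

In a real cluster each phase runs in parallel across machines; the programming model stays exactly this simple.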
Especially MS Azure is recommended, as Microsoft Fabric is integrated within Azure services. Experience of designing robust, secure and compliant capabilities. Strong understanding of Apache Spark, including its architecture and components, and how to create, monitor, optimize, and scale Spark jobs. Experienced working in a DevOps/Agile team. Experience …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
ARM
the internal needs of Arm with those of the community. Experience building or using industry-standard programmer's interfaces. Expert understanding of TensorFlow, PyTorch, Apache TVM, or comparable inference frameworks. Expert understanding of AI training, quantization and visualization tools. Proven experience with Linux. Comfortable presenting technical topics to both …
of ticketing systems. Knowledge of New Relic, Dynatrace, Datadog, or equivalent. Knowledge of HTML, CSS, JS, PHP, Java, MySQL. Experience maintaining web and application servers (Apache, NGINX, PHP, Redis, Tomcat, Varnish, etc.). Enthusiastic and results-driven, punctual and organised, a good problem solver. Why BORN? We are an award-winning global …
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You'll be confident using technologies such as: Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as: SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript etc., using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on Cloud platforms …
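Translating unstructured data into a data model, as described above, usually starts with parsing semi-structured text into typed records. A small sketch; the log format and field names are invented for illustration:

```python
import re

# Hypothetical application log lines to be mapped into a structured model
log_lines = [
    "2024-05-01 12:00:03 ERROR payment failed",
    "2024-05-01 12:00:04 INFO user logged in",
]

pattern = re.compile(r"^(?P<date>\S+) (?P<time>\S+) (?P<level>\w+) (?P<message>.+)$")

# Parse each line into a dict of named fields — the structured side of the flow
records = [pattern.match(line).groupdict() for line in log_lines]
errors = [r for r in records if r["level"] == "ERROR"]
print(len(errors))  # 1
```

In a Kafka/NiFi flow this parsing step would sit in a processor between the raw topic and the modelled, queryable store.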
data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Our Commitment to Diversity and Inclusion: At Databricks, we are committed …
designing and building robust, scalable, distributed data systems and pipelines, using open source and public cloud technologies. Strong experience with data orchestration tools: e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies: e.g. DBT, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL). … Knowledge of event-driven architectures and streaming technologies: e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments: e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices: e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, git/version control, containers etc. Strong …
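The event-driven architectures mentioned reduce to one pattern: producers publish events to named topics, and any number of consumers react independently. A toy in-memory sketch of that publish/subscribe idea; this is not the Kafka client API, and all names are invented:

```python
from collections import defaultdict

class MiniBroker:
    """Toy stand-in for a message broker: topics fan events out to subscribers."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Every subscriber to the topic sees every event (fan-out)
        for handler in self.subscribers[topic]:
            handler(event)

broker = MiniBroker()
seen = []
broker.subscribe("orders", seen.append)
broker.subscribe("orders", lambda e: seen.append(e.upper()))
broker.publish("orders", "order-created")
print(seen)  # ['order-created', 'ORDER-CREATED']
```

Kafka adds durable partitioned logs, consumer groups, and replay on top of this decoupling; Flink and Kafka Streams then run continuous computations over those event streams.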
ETL/ELT tools. Experience with NoSQL type environments, Data Lakes, Lake-Houses (Cassandra, MongoDB or Neptune). Experience with distributed storage, processing engines such as Apache Hadoop and Apache Spark. Experience with message brokering/stream processing services such as Apache Kafka, Confluent, Azure Stream Analytics. Experience in Test-Driven …
Scala, Kotlin, Spark, Google PubSub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark, Google PubSub etc. Optimising data storage and retrieval systems for maximum performance using both relational and NoSQL databases. Continuously monitoring and improving … Data Infrastructure projects, as well as designing and building data intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages - Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and data …
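The scalability of systems like Spark and PubSub rests on partitioning records by key, so partitions can be processed independently while per-key ordering is preserved. A minimal sketch of hash partitioning; the partition count and keys are invented:

```python
import zlib

# Hash-partition records by key, as Spark and Kafka do to parallelise work.
NUM_PARTITIONS = 4

def partition_for(key: str) -> int:
    # crc32 is a stable stdlib hash, so a key always lands in the same partition
    return zlib.crc32(key.encode()) % NUM_PARTITIONS

records = [("user-1", "click"), ("user-2", "view"), ("user-1", "buy")]
partitions = {}
for key, value in records:
    partitions.setdefault(partition_for(key), []).append((key, value))

# All of user-1's events share one partition, preserving their relative order
p = partition_for("user-1")
print([v for k, v in partitions[p] if k == "user-1"])  # ['click', 'buy']
```

The same idea appears as partitioned topics in PubSub/Kafka and as shuffle partitions in Spark.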
Manchester Area, United Kingdom Hybrid / WFH Options
The Green Recruitment Company
Support colleagues in relation to the delivery of ESG built environment solutions. Exhibit thorough expertise in IES-VE, including modules like VE Compliance, Radiance, Apache HVAC, Apache Systems, MacroFlow, MicroFlow, and Vista-Pro, and the ability to extract sustainability outputs (e.g., for BREEAM, LEED) from IES-VE. Your …
work is largely down to you. It can be entirely back end. Otherwise, the stack includes Redux Saga, Ag-Grid, Node, TypeScript, gRPC, protobuf, Apache Ignite, Apache Airflow and AWS. As the application suite grows and advances in complexity, there is a decent amount of interaction with the …