of atomic concepts Compiler technologies such as building interpreters, compilers and DSLs Relational Databases and Massively Parallel Database systems Big Data Technologies such as Apache Spark, Apache Arrow Software Development in a Commercial Environment Qualification & Skills: Development Tools and Methodologies Experience of TDD and BDD in a … Exposure to continuous build and deployment solutions such as Jenkins Able to work within an agile environment delivering software incrementally Nice to have: ClickHouse, Apache Spark, Kafka, Postgres, OLAP Thank you for considering us …
analysis. Your expertise will be instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management … adherence to best practices and maintaining high-security standards. Requirements: Security Clearance: Must hold a current and valid Security Clearance. Technical Skills: Proficient with Apache Spark, AWS RDS, and Hadoop. Experienced in using Tableau for data visualization and reporting. Familiarity with Red Hat Decision Manager for business rules …
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as: Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as: SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript etc, using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on Cloud …
Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands …
Penrith, Cumbria, United Kingdom Hybrid / WFH Options
Computer Futures
Databricks, Azure SQL (Indicative experience = 5yrs+) Build and test processes supporting data extraction, data transformation, data structures, metadata, dependency and workload management. Knowledge of Spark architecture and modern Data Warehouse/Data Lake/Lakehouse techniques Build transformation tables using SQL. Moderate-level knowledge of Python/PySpark or equivalent …
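The "build transformation tables using SQL" task in this role can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 so it runs anywhere; on Databricks the same CREATE TABLE AS SELECT pattern would be expressed as Spark SQL over Delta tables. The `raw_sales` and `sales_by_region` table names and columns are hypothetical examples, not from the posting.

```python
import sqlite3

# Minimal sketch of building a transformation table with SQL.
# sqlite3 stands in for Spark SQL/Databricks; the raw_sales and
# sales_by_region tables are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# The transformation table: an aggregate derived from the raw layer
conn.execute(
    """
    CREATE TABLE sales_by_region AS
    SELECT region, SUM(amount) AS total_amount
    FROM raw_sales
    GROUP BY region
    """
)

rows = dict(
    conn.execute(
        "SELECT region, total_amount FROM sales_by_region ORDER BY region"
    )
)
print(rows)  # {'north': 150.0, 'south': 75.0}
```

The same derived-table pattern scales from a laptop to a Lakehouse: only the engine underneath the SQL changes.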
Greater London, England, United Kingdom Hybrid / WFH Options
Anson McCade
encompassing experience in both stream and batch processing. Proficiency in the design and deployment of production data pipelines, involving languages like Java, Python, Scala, Spark, and SQL. You should also have some, if not all, of the following; Capability in scripting, data extraction via APIs, and the composition of …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
LSA Recruit
transformation, and visualization capabilities. - Strong programming skills in Python, SQL, and other relevant languages. - Experience with big data technologies and tools such as Hadoop, Spark, and Kafka. - Familiarity with cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes). *Soft Skills:* - Excellent problem-solving and analytical skills. - Strong …
Bristol, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes) Ability to …
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc) with Apache Spark, Databricks or Hadoop Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes) Ability to …
as TensorFlow, PyTorch, or Scikit-learn. Strong knowledge of statistical modelling, data mining, and data visualization techniques. Experience with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, GCP, Azure). Strong problem-solving skills and the ability to think critically and creatively. Excellent analytical skills with …
London, Liverpool, Merseyside, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
rate of £250-£400, falling inside IR35 regulations. Key Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes using AWS, Databricks, Python, Spark, and SQL. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Optimize and troubleshoot data … Glue). Hands-on experience with Databricks for data processing and analytics. Proficient in Python programming for data manipulation and automation. Solid understanding of Apache Spark for big data processing. Strong SQL skills for data querying, transformation, and analysis. Excellent problem-solving abilities and attention to detail. Ability …
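The "design, develop, and maintain scalable data pipelines and ETL processes" responsibility above follows the extract-transform-load pattern. A hedged sketch of that pattern in plain Python is below; a real implementation in this role would use Spark DataFrames on Databricks against AWS sources, and the function names, record schema, and sample data here are illustrative assumptions.

```python
# Hedged sketch of the extract-transform-load (ETL) pattern behind a
# typical Spark/Databricks pipeline, in plain Python so it runs anywhere.

def extract():
    # Stand-in for reading from S3/Glue-catalogued sources
    return [
        {"user": "a", "amount": "10.5"},
        {"user": "b", "amount": "bad"},   # malformed row
        {"user": "a", "amount": "4.5"},
    ]

def transform(records):
    # Cast types, drop malformed rows, aggregate per user - the same
    # logic a Spark job would express over DataFrames
    totals = {}
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # skip rows that fail the cast
        totals[r["user"]] = totals.get(r["user"], 0.0) + amount
    return totals

def load(totals):
    # Stand-in for a write to a warehouse table
    return sorted(totals.items())

result = load(transform(extract()))
print(result)  # [('a', 15.0)] - the malformed 'b' row was dropped
```

Keeping extract, transform, and load as separate stages is what makes a pipeline like this testable and easy to optimize stage by stage.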
Learn, TensorFlow, PyTorch). Solid understanding of ML and data pipeline architectures and best practices. Experience with big data technologies and distributed computing (e.g., Spark, Hadoop) is a plus. Proficient in SQL and experience with relational databases. Strong analytical and problem-solving skills, with a keen attention to detail. …
or similar technologies. Hands-on experience with AWS and Snowflake. Financial services industry experience (highly desirable). Experience with Big Data technologies such as Spark or Hadoop. Bachelor's degree in Computer Science, Engineering, or equivalent. Further information available upon application. ECS Recruitment Group Ltd is acting as an …
London, England, United Kingdom Hybrid / WFH Options
Anson McCade
tools such as Informatica MDM, Informatica AXON, Informatica EDC, and Collibra • MySQL, SQL Server, Oracle, Snowflake, PostgreSQL and NoSQL databases • Programming languages such as Spark or Python • Amazon Web Services, Microsoft Azure or Google Cloud and distributed processing technologies such as Hadoop Benefits : • Base Salary …
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
Bachelor's degree in Computer Science or a related field (Master's degree preferred) Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to our …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
of OO programming, software design, i.e., SOLID principles, and testing practices. Knowledge and working experience of AGILE methodologies. Proficient with SQL. Familiarity with Databricks, Spark, geospatial data/modelling and insurance are a plus. Exposure to MLOps, model monitoring principles, CI/CD and associated tech, e.g., Docker, MLflow …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
/transformation of large data sets using varied databases and platforms and a knowledge and experience of Google Analytics as well as Python and Spark libraries would be advantageous. Please note this is a fixed term appointment to cover maternity leave. You will work closely with Architects and Tech …
Employment Type: Contract, Part Time, Work From Home
Basingstoke, England, United Kingdom Hybrid / WFH Options
Intec Select
cross-functionally across the business to understand the requirements of the products Designing and implementing performance-related data ingestion pipelines from multiple sources using Apache Spark Integrating end-to-end data pipelines ensuring a high level of quality is maintained Working with an Agile delivery/DevOps methodology …
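The "data ingestion pipelines from multiple sources" work described above boils down to reading heterogeneous inputs and normalising them into one record set. A minimal stdlib sketch is below; in the role itself this union would be built with Apache Spark readers (e.g. CSV and JSON sources), and the inline payloads and field names here are hypothetical.

```python
import csv
import io
import json

# Illustrative multi-source ingestion: normalise a CSV feed and a JSON
# feed into one list of records with consistent types. The payloads and
# the id/value schema are made-up examples.
csv_source = "id,value\n1,10\n2,20\n"
json_source = '[{"id": 3, "value": 30}]'

records = []
# CSV fields arrive as strings, so cast them explicitly
for row in csv.DictReader(io.StringIO(csv_source)):
    records.append({"id": int(row["id"]), "value": int(row["value"])})
# JSON fields are already typed by the parser
for row in json.loads(json_source):
    records.append({"id": row["id"], "value": row["value"]})

print(records)
# [{'id': 1, 'value': 10}, {'id': 2, 'value': 20}, {'id': 3, 'value': 30}]
```

Normalising types at the ingestion boundary, as done here, is what keeps the downstream end-to-end pipeline quality checks simple.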
Surrey, England, United Kingdom Hybrid / WFH Options
Hawksworth
working in the world of Data Science You're more than capable with SQL & Python You have exposure to big data technologies such as Spark Ideally you will have experience with statistical analysis, machine learning algorithms, and data mining techniques You have excellent communication skills and can communicate well …
Greenwich, England, United Kingdom Hybrid / WFH Options
Source Recruit
products such as Data Factory, Event Hubs, Data Lake, Synapse, and Azure SQL Server. * Proficiency in developing with Databricks and coding with PySpark and Spark SQL. * Experience in delivery of data and analytics solutions/applications to automate business processes and meet user function and quality characteristics including performance …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
refining them to strong results. Exposure to Python data science stack Knowledge and working experience of AGILE methodologies. Proficient with SQL. Familiarity with Databricks, Spark and geospatial data/modelling are a plus. We’ll help you gain… Experience working in a high-performance environment where collaboration and business …
Skills & Experience At least 10 years' experience working with JavaScript or Python/Java Previous experience deploying software into the Cloud EKS, Docker, Kubernetes Apache Spark or NiFi Microservice architecture experience Experience with AI/ML systems …
they are on the lookout for 2 AWS Data Engineers to come in on a contract basis. Key Skills/Requirements: Must have Python & Spark experience Must have strong AWS experience Must have Terraform experience SQL & NoSQL experience Have built out Data Warehouses & built Data Pipelines Strong Databricks & Snowflake …
City of London, London, United Kingdom Hybrid / WFH Options
Oliver Bernard Ltd
Experience with a JVM language, Kotlin, Java, Scala, Clojure Knowledge of Typescript and React is beneficial Exposure to data pipelines using technologies such as Spark and Kafka Experience with cloud services (ideally AWS) Hybrid working 1-2 days per week in Central London. £110,000 depending on experience. Please …