More details of the position - Ideal Qualifications (Must Have): Platform Engineer, Azure DevOps and CI/CD tools, Azure Cloud, Microsoft Fabric, Azure Services, Apache Spark, experience of using IaC (Terraform, APIs), Data Engineer, Big Data, PySpark. Solid understanding of data engineering concepts and experience of building and maintaining … DevOps/Agile. Experience of managing environments using IaC (Terraform, APIs). Experience of designing robust, secure, and compliant platform capabilities. Strong understanding of Apache Spark, including its architecture and components, and how to create, monitor, optimize, and scale Spark jobs. Please send your resumes to adithya.thakur@s3staff.com for immediate …
end ownership • Python or similar (Ruby or Node) or another functional language • JavaScript and associated frameworks, preferably Vue, or similar • Cloud technologies • SQL (advantageous) • Spark (advantageous) • Docker/Kubernetes (advantageous) • MongoDB, SQL, Postgres & Snowflake (advantageous) • Developing online, cloud-based SaaS products • Leading and building scalable architectures and distributed systems …
London, England, United Kingdom Hybrid / WFH Options
Harnham
DevOps experience in CI/CD. Experience using TensorFlow, Kubernetes, MLflow, Kafka, and Airflow. Experience using Python is a must (tools like AWS and Spark are beneficial). Excellent communication skills and team and colleague engagement. A keen interest in problem-solving and using scalable machine learning to solve the …
ownership • Python, Java or similar (Ruby or Node) or another functional language • JavaScript and associated frameworks, preferably Vue, or similar • Cloud technologies • SQL (advantageous) • Spark (advantageous) • Docker/Kubernetes (advantageous) • MongoDB, SQL, Postgres & Snowflake (advantageous) • Developing online, cloud-based SaaS products • Leading and building scalable architectures and distributed systems …
Exciting Opportunity: Python/Spark Big Data Software Engineer - Financial Services Sector Location: Glasgow Contract: 6 Months (Inside IR35) Rate: Up to £600 per Day Are you ready to take your software engineering career to the next level? We invite you to join a leading team in the financial … services industry where innovation meets top-tier excellence. We are on the lookout for a seasoned Python/Spark Big Data Software Engineer to play a key role in our agile team. What You Will Do: Devise and implement cutting-edge software solutions, venturing beyond traditional methods to tackle … languages. Comprehensive understanding of the Software Development Life Cycle and Agile methodologies. Familiarity with Python or PySpark and cloud technologies like AWS, Kubernetes, and Spark is highly desirable. Ideal Candidate: Someone with a knack for innovative solutions and a commitment to technical excellence. Previous experience in the financial services …
on Kubernetes with Helm/Terraform. Good to have: prior experience dealing with streaming and batch compute frameworks like Spring Kafka, Kafka Streams, Flink, Spark Streaming, and Spark. Experience with large-scale computing platforms such as Hadoop, Hive, Spark, and NoSQL stores. Experience with developing large-scale data pipelines …
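As a rough illustration of the streaming requirement several of these listings raise, here is a minimal PySpark Structured Streaming sketch that consumes a Kafka topic and aggregates it in micro-batches. The broker address, topic name, and checkpoint path are invented placeholders, and the job assumes the spark-sql-kafka package is on the classpath; this is a sketch of the pattern, not any posting's actual pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

# Minimal sketch: a micro-batch streaming job over a Kafka topic.
# Broker, topic, and checkpoint location are hypothetical placeholders.
spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")  # requires the spark-sql-kafka-0-10 package
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Count messages per key in one-minute event-time windows.
counts = (
    events
    .withColumn("key", col("key").cast("string"))
    .groupBy(window(col("timestamp"), "1 minute"), col("key"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder
    .start()
)
query.awaitTermination()
```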
/CD, Application Resiliency, and Security. Preferred qualifications, capabilities, and skills: · Skilled with Python or PySpark · Exposure to cloud technologies (Airflow, Astronomer, Kubernetes, AWS, Spark, Kafka) · Experience with Big Data solutions or relational databases · Experience in the financial services industry is a bonus. …
Birmingham, England, United Kingdom Hybrid / WFH Options
⭕️ Nimbus®
as Python, C#, .NET and/or JavaScript is highly desirable. Experience with cloud platforms (e.g., Azure) and data technologies (e.g., SQL, NoSQL, Hadoop, Spark). PLEASE NOTE: You must have either UK citizenship or indefinite leave to remain in the UK. Due to the high volume of applications …
designing and building robust, scalable, distributed data systems and pipelines using open-source and public cloud technologies. Strong experience with data orchestration tools, e.g. Apache Airflow, Dagster. Experience with big data storage and processing technologies, e.g. dbt, Spark, SQL, Athena/Trino, Redshift, Snowflake, RDBMSs (PostgreSQL/MySQL …). Knowledge of event-driven architectures and streaming technologies, e.g. Apache Kafka, Kafka Streams, Apache Flink. Experience with public cloud environments, e.g. AWS, GCP, Azure, Terraform. Strong knowledge of software engineering practices, e.g. testing, CI/CD (Jenkins, GitHub Actions), agile development, Git/version control, containers etc. …
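To make the orchestration requirement concrete, here is a minimal hypothetical Airflow DAG chaining an extract step into a Spark-style transform. The DAG id, task names, and callables are illustrative only, and the syntax assumes Airflow 2.x; it is a sketch of the tool, not taken from the listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical callables standing in for real pipeline steps.
def extract(**context):
    print("pull raw data from a source system")

def transform(**context):
    print("run a Spark job or dbt model over the raw data")

with DAG(
    dag_id="example_pipeline",        # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Express the dependency: extract runs before transform.
    extract_task >> transform_task
```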
SQL Server, Sybase, Snowflake) Document databases (e.g. Mongo, ArangoDB, Couchbase, Solr) Big Data (e.g. Hadoop ecosystem, Bigtable) Data streaming (e.g. Kafka, Flink, Pulsar, Beam, Spark) Cloud databases (e.g. Snowflake, CockroachDB) Other database genres (e.g. graph, columnar, time series) In return, we’ll give you… A competitive basic salary … scheme A high-spec laptop (of course!) Need more reasons? Here are a few more… Work with some of the most exciting new technologies Spark off co-workers who’ll challenge your thinking and help you to achieve your potential Deal openly and honestly with customers Benefit from a …
Cheltenham, England, United Kingdom Hybrid / WFH Options
Yolk Recruitment Ltd
software solutions. Skills Required: In-depth experience designing & building backend applications in Python. Familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Experience developing in a highly Agile/Scrum environment. Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS). Benefits …
with ETL processes and tools. • Knowledge of cloud platforms (e.g., GCP, AWS, Azure) and their data services. • Familiarity with big data technologies (e.g., Hadoop, Spark) is a plus. • Understanding of AI tools like Gemini and ChatGPT is also a plus. • Excellent problem-solving and communication skills. • Ability to work …
Basingstoke, England, United Kingdom Hybrid / WFH Options
Intec Select
cross-functionally across the business to understand the requirements of the products. Designing and implementing performance-related data ingestion pipelines from multiple sources using Apache Spark. Integrating end-to-end data pipelines, ensuring a high level of quality is maintained. Working with an Agile delivery/DevOps methodology …
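Since several of these roles describe multi-source ingestion with Apache Spark, the sketch below shows one common shape of such a pipeline: read two feeds, apply a simple quality gate, conform them, and land partitioned Parquet. The bucket paths, column names, and join key are invented for illustration.

```python
from pyspark.sql import SparkSession

# Minimal sketch of a multi-source ingestion job.
# All paths, columns, and table names below are hypothetical placeholders.
spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

# Source 1: CSV drops from an upstream system.
orders = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-bucket/raw/orders/")
)

# Source 2: JSON events from another feed.
customers = spark.read.json("s3://example-bucket/raw/customers/")

# A simple quality gate: drop rows missing the join key.
orders_clean = orders.dropna(subset=["customer_id"])

# Conform the two feeds into one curated dataset.
curated = orders_clean.join(customers, on="customer_id", how="left")

# Land the result as partitioned Parquet for downstream consumers.
curated.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders_enriched/"
)
```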
space of data and AI technologies and business scenarios. Strong understanding of cutting-edge and legacy Big Data and AI technologies such as Hadoop, Spark, OpenAI, and Claude, as well as architectures and domains such as Computer Vision, NLP, Neural Networks, Machine Learning, Generative AI, Data Warehouse, and Lakehouse …
delivering moderate-to-complex data flows as part of a development team in collaboration with others. You’ll be confident using technologies such as Apache Kafka, Apache NiFi, SAS DI Studio, or other data integration platforms. You can implement, deliver, and translate several data models, including unstructured data … and recognised standards to build solutions using various traditional or big data languages such as SQL, PL/SQL, SAS Macro Language, Python, Scala, Apache Spark, Java, JavaScript etc., using various tools including SAS, Hue (Hive/Impala), Kibana (Elasticsearch). Knowledge of data management on cloud …
with Git for version control and project management, alongside some knowledge of Linux/shell. Data platform familiarity - previous experience of working with both Apache Spark and MapReduce data processing and analytics frameworks. And reporting expertise - experience with Tableau, Power BI, and Excel, alongside notebooks for experiment documentation. What …
analysis. Your expertise will be instrumental in ensuring the security and efficiency of the data handling and reporting processes. Key Responsibilities: Data Processing: Utilize Apache Spark, AWS RDS, and Hadoop to process large datasets efficiently and securely. Reporting: Generate comprehensive and insightful reports using Tableau. Business Rules Management … adherence to best practices and maintaining high security standards. Requirements: Security Clearance: Must hold a current and valid Security Clearance. Technical Skills: Proficient with Apache Spark, AWS RDS, and Hadoop. Experienced in using Tableau for data visualization and reporting. Familiarity with Red Hat Decision Manager for business rules …
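As a hedged sketch of the Spark-plus-RDS combination this listing mentions, the snippet below reads a relational table into Spark over JDBC and writes an aggregate out for reporting. The hostname, table, credentials handling, and output path are all placeholders, and the Postgres JDBC driver is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import count

# Sketch only: reading a relational table into Spark over JDBC.
# Host, database, table, and credential source are hypothetical.
spark = SparkSession.builder.appName("rds-report-sketch").getOrCreate()

transactions = (
    spark.read
    .format("jdbc")
    .option("url", "jdbc:postgresql://example-rds-host:5432/reporting")
    .option("dbtable", "public.transactions")
    .option("user", "report_reader")  # in practice, pull from a secrets store
    .option("password", "***")        # never hard-code real credentials
    .option("driver", "org.postgresql.Driver")
    .load()
)

# A simple report: transaction counts per account, written as Parquet
# for a downstream Tableau extract.
summary = transactions.groupBy("account_id").agg(count("*").alias("txn_count"))
summary.write.mode("overwrite").parquet("s3://example-bucket/reports/txn_summary/")
```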
London, England, United Kingdom Hybrid / WFH Options
Morgan McKinley
NumPy, scikit-learn). Understanding of database technologies (ETL) and SQL proficiency for data manipulation, data mining, and querying. Knowledge of Big Data tools (Spark or Hadoop a plus). Power BI dashboard design/development. Regulatory Awareness/Compliance: Uphold regulatory/compliance requirements relevant to your role …
DW/BI systems · Demonstrated ability in data modeling, ETL development, and data warehousing · Strong experience with Big Data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.) · Expertise in a BI solution like Power BI · Hands-on experience in modelling databases (particularly NoSQL), working on indexes, materialized views, performance tuning … with impressive visualization (Power BI) · Experience in building large-scale DW/BI systems for B2B SaaS companies · Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR, and RDS · Experience with AI/Machine Learning and predictive analytics · Experience in developing global products …
ideally in a start-up or scale-up. - Machine learning libraries and frameworks (TensorFlow, PyTorch, scikit-learn). - Python. - Big data processing tools (e.g., Spark). The role offers a salary range of £70-100K depending on experience. The successful candidate must be able to work from …
Especially MS Azure is recommended, as Microsoft Fabric is integrated within Azure services. Experience of designing robust, secure, and compliant capabilities. Strong understanding of Apache Spark, including its architecture and components, and how to create, monitor, optimize, and scale Spark jobs. Experienced working in a DevOps/Agile …
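The create/monitor/optimize/scale requirement in this and the first listing maps onto a handful of standard Spark levers. Below is a minimal sketch of a batch job with those levers made explicit; every configuration value and path is an illustrative placeholder, not a recommendation from the posting.

```python
from pyspark.sql import SparkSession

# Sketch of the create/monitor/optimize/scale loop the listings describe.
# All configuration values and paths are illustrative assumptions.
spark = (
    SparkSession.builder
    .appName("tuning-sketch")
    # Scale: executor sizing is the usual first lever on a cluster.
    .config("spark.executor.instances", "4")
    .config("spark.executor.memory", "4g")
    .config("spark.executor.cores", "2")
    # Optimize: shuffle parallelism often needs adjusting from the default 200.
    .config("spark.sql.shuffle.partitions", "64")
    .getOrCreate()
)

df = spark.read.parquet("s3://example-bucket/events/")  # placeholder path

# Monitor: explain() surfaces the physical plan; the Spark UI (port 4040 by
# default) shows per-stage task and shuffle metrics while the job runs.
result = df.groupBy("event_type").count()
result.explain()
result.write.mode("overwrite").parquet("s3://example-bucket/event_counts/")
```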
working closely with our product teams on existing projects and new innovations to support company growth and profitability. Our Tech Stack: Python, Scala, Kotlin, Spark, Google Pub/Sub, Elasticsearch, BigQuery, PostgreSQL, Kubernetes, Docker, Airflow. Key Responsibilities: Designing and implementing scalable data pipelines using tools such as Apache Spark … Data Infrastructure projects, as well as designing and building data-intensive applications and services. Experience with data processing and distributed computing frameworks such as Apache Spark. Expert knowledge in one or more of the following languages: Python, Scala, Java, Kotlin. Deep knowledge of data modelling, data access, and …
role. Good level of experience of Data Lake/Hadoop platform implementation. Good level of hands-on experience in implementing and performance-tuning Hadoop/Spark implementations. Experience with Apache Hadoop and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr … Avro). Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto). Experience developing software code in one or more programming languages (Java, Python, etc.). Preferred Qualifications: Master's or PhD in Computer Science, Physics, Engineering, or Maths. Hands-on experience leading large-scale global data warehousing …
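For the SQL-on-Hadoop technologies this listing groups together, the unifying idea is running ANSI-style SQL against tables catalogued in a Hive metastore. A minimal Spark SQL sketch of that pattern follows; the database and table names are hypothetical, and a configured Hive metastore is assumed.

```python
from pyspark.sql import SparkSession

# Sketch of the SQL-on-Hadoop pattern: querying a Hive-managed table
# through Spark SQL. Database and table names are hypothetical.
spark = (
    SparkSession.builder
    .appName("sql-on-hadoop-sketch")
    .enableHiveSupport()  # requires a Hive metastore to be configured
    .getOrCreate()
)

# A statement like this could run largely unchanged on Hive, Impala, or Presto.
top_products = spark.sql("""
    SELECT product_id, SUM(quantity) AS total_sold
    FROM warehouse.sales
    GROUP BY product_id
    ORDER BY total_sold DESC
    LIMIT 10
""")
top_products.show()
```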
Bricks setup using Terraform experience. * Experience of MLOps and DataOps. * Experience of using container technologies, cloud platforms (ideally AWS), and distributed processing frameworks like Spark and Dask. * Experience in JavaScript application development and UI design. * Expertise in developing mobile applications. * Familiarity with the agile software development process. If you …