operations Independently crafted and built scalable and reliable ETL, services, and pipelines, using Python/Flask/FastAPI/etc., Next.js, Kafka, Flink, Kubeflow, Spark, etc. Fluent in languages such as Python, Java, and JavaScript. Hands-on experience with cloud computing platforms such as AWS. Excellent communication, collaboration, and more »
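Several of these listings centre on building ETL pipelines in Python. As an illustrative sketch only (the sample data, function names, and SQLite target below are invented for the example, not taken from any listing), a minimal extract-transform-load pass might look like:

```python
import csv
import io
import sqlite3

# Hypothetical sample input; a real pipeline would read from S3, Kafka, a DB, etc.
RAW_CSV = """user_id,amount,currency
1,10.50,GBP
2,,GBP
3,7.25,gbp
"""

def extract(text):
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop incomplete rows and normalize currency codes."""
    return [
        {"user_id": int(r["user_id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"].upper()}
        for r in rows
        if r["amount"]  # skip rows with a missing amount
    ]

def load(rows, conn):
    """Load: write cleaned rows into a relational store."""
    conn.execute("CREATE TABLE payments (user_id INT, amount REAL, currency TEXT)")
    conn.executemany(
        "INSERT INTO payments VALUES (:user_id, :amount, :currency)", rows
    )

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

In practice the same extract/transform/load split is what frameworks like AWS Glue or Airflow tasks orchestrate; only the sources, sinks, and scale change.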
and coding environments. Bonus Skills: Python/PHP/TypeScript/ReactJS AI/ML models and usage ETL pipelines in AWS (Glue/Apache Spark) API load testing If you would like more information on the role or would like to apply, then please send your CV more »
Modelling. Experience with one or more of these programming languages: Python, Scala/Java. Experience with distributed data and computing tools, mainly Apache Spark & Kafka. Understanding of critical-path approaches and how to iterate to build value, engaging actively with stakeholders at all stages. Able to deal more »
with Git for version control and project management, alongside some knowledge of Linux/Shell. Data platform familiarity - previous experience of working with both the Apache Spark and MapReduce data processing and analytics frameworks. Reporting expertise - experience with Tableau, Power BI, and Excel, alongside notebooks for experiment documentation. What more »
manage several tasks/projects concurrently and prioritize work effectively. • Experience in Risk and Finance or Regulatory reporting. • Understanding of Big Data technologies, Cloudera, Spark. • Experience in CI/CD pipeline implementation. • Good exposure to Python scripting more »
Platforms Must have 8+ years' experience with Relational Databases like Oracle, NoSQL Databases, and/or Big Data technologies (e.g. Oracle, SQL Server, Postgres, Spark, Hadoop, other Open Source). Must have experience in Data Security Solutions (Identity and Access Management and Data Security Access Management). Must have 3+ more »
Engineer, with expertise in developing scalable data pipelines. Strong object-oriented programming skills, particularly in Python. Experience with data lakes and data warehousing solutions (Spark, Dataflow, BigQuery). Knowledge of SQL and experience with relational databases, as well as NoSQL databases. Familiarity with cloud services (preferably GCP) and understanding more »
Google Cloud Composer, AWS Step Functions, Azure Data Factory, UC4, Control-M). * Experience with data quality and validation. * Experience querying massive datasets using Spark, Presto, Hive, Impala, etc. * Experience in optimization of computer-vision applications. * Experience in building highly scalable, performant data pipelines. * Experience with Data Modeling. Morgan more »
Engineering experience Demonstrate in-depth knowledge of large-scale data platforms (Databricks, Snowflake) and cloud-native tools (Azure Synapse, Redshift) Experience with analytics technologies (Spark, Hadoop, Kafka) Have familiarity with Data Lakehouse architecture, SQL Server, DataOps, and data lineage concepts Demonstrate in-depth knowledge of large-scale data platforms more »
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
Bachelor's degree in Computer Science or a related field (Master's degree preferred) Nice to have: experience with LLMs, Vector Databases, AWS EMR, Spark, and Python Our commitment: Equal opportunities are important to us. We believe that diversity and inclusion at The Stepstone Group are critical to our more »
CI/CD/YAML/ARM/Terraform MSBI Traditional Stack (SQL, SSAS, SSIS, SSRS) Azure Automation/PowerShell Azure Streaming Analytics/Spark Streaming Azure Functions/C# .NET PowerApps Data Science Master Data Management/MDS WHY ADATIS? There’s a long list of reasons, from more »
Preferred Qualifications): Master's degree in an analytical discipline Knowledge of Big Data and Data Warehousing Experience with text analytics and machine learning Exposure to Python, Spark, and Databricks Experience with Digital Clickstream or Supply Chain is a plus United Airlines is an equal opportunity employer. United Airlines recruits, employs, trains more »
reigate, south east england, United Kingdom Hybrid / WFH Options
esure Group
of OO programming, software design (e.g., SOLID principles), and testing practices. Knowledge of and working experience with Agile methodologies. Proficient with SQL. Familiarity with Databricks, Spark, geospatial data/modelling, and insurance are a plus. Exposure to MLOps, model monitoring principles, CI/CD and associated tech, e.g., Docker, MLflow more »
of building and scaling DE teams An ability to work with partners across the organization and develop the right prioritization frameworks. Technical Skills - AWS, Spark, Airflow, SQL, Databricks, Data Modeling Soft Skills - Presenting, Storytelling, Communicating Get to know us Zillow is reimagining real estate to make home a reality more »
model training, evaluation, and productionization. - Strong programming skills in Python, with proficiency in ML frameworks (e.g., TensorFlow, PyTorch) and data engineering tools (e.g., Kafka, Spark). - Expertise in cloud computing platforms (AWS, Azure) and containerization technologies (Docker) for scalable and reliable ML model deployment. - Solid understanding of data privacy more »
DevOps/Agile Experience of managing environments using IaC (Terraform APIs) Experience of designing robust, secure, and compliant platform capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs. The Package We offer a competitive salary and a more »
Especially MS Azure is recommended, as Microsoft Fabric is integrated within Azure services. Experience of designing robust, secure, and compliant capabilities. Strong understanding of Apache Spark, including its architecture, components, and how to create, monitor, optimize, and scale Spark jobs. Experienced working in a DevOps/Agile more »
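Several listings above ask for an understanding of how Spark jobs are created and scaled. Spark itself needs a cluster runtime, so as a rough plain-Python stand-in only (the data and function names are invented for illustration), the core idea a Spark job expresses is: partition the data, map over partitions in parallel with local pre-aggregation, then reduce the partial results:

```python
from functools import reduce
from multiprocessing.pool import ThreadPool

# Hypothetical dataset; a real Spark job would read a distributed dataset (RDD/DataFrame).
data = list(range(1, 101))

def partition(seq, n):
    """Split the dataset into n roughly equal partitions, as Spark does with RDDs."""
    size = (len(seq) + n - 1) // n
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def map_partition(part):
    """Per-partition work: square each element, pre-aggregating locally
    (analogous to Spark combining within a partition before the shuffle)."""
    return sum(x * x for x in part)

# Map stage: process the 4 partitions in parallel worker threads.
with ThreadPool(4) as pool:
    partials = pool.map(map_partition, partition(data, 4))

# Reduce stage: combine the per-partition results into one value.
result = reduce(lambda a, b: a + b, partials)
```

Tuning a real Spark job is largely about the knobs this sketch hides: choosing the partition count, minimizing data shuffled between stages, and sizing executors to the workload.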
Swansea, Wales, United Kingdom Hybrid / WFH Options
CPS Group (UK) Limited
my client will train you): Knowledge of Microsoft SQL Server and packaged BI tools (SSAS and SSIS). Docker, Kubernetes, and cloud computing technologies. Apache Kafka and data streaming. Familiarity with Apache Spark or similar data processing tools. Experience developing and maintaining CI/CD pipelines, particularly Azure DevOps more »
classification techniques, and algorithms Fluency in a programming language (Python, C, C++, Java, SQL) Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau more »