cloud providers and collaborate with many technology partners, including AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. If you are: experienced with Python/PySpark; proficient working with Databricks Lakehouse architecture and principles; have 2+ years of experience designing data models, building ETL pipelines, and wrangling data to solve business …
application development, testing, and operational stability, especially with data pipelines. Proficiency in Python and data manipulation libraries such as NumPy and pandas. Experience with PySpark, including analysis, pipeline building, tuning, and feature engineering. Knowledge of SQL and NoSQL databases, including joins, aggregations, and tuning. Experience with ETL processes and …
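For readers unfamiliar with the PySpark work these listings describe, here is a minimal illustrative sketch of a join, an aggregation, and a simple tuning step. All table, path, and column names are hypothetical:

```python
# Illustrative sketch only: paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-analysis").getOrCreate()

orders = spark.read.parquet("/data/orders")        # hypothetical path
customers = spark.read.parquet("/data/customers")  # hypothetical path

# Broadcast the smaller dimension table to avoid a shuffle (a common tuning step)
enriched = orders.join(F.broadcast(customers), on="customer_id", how="left")

# Aggregation: total and average order value per customer region
summary = (
    enriched.groupBy("region")
    .agg(
        F.sum("order_value").alias("total_value"),
        F.avg("order_value").alias("avg_value"),
    )
)
summary.show()
```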
and the broader Azure ecosystem. Requirements: proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services; experience using PySpark in notebooks for data analysis and manipulation; strong proficiency with SQL and data modelling; experience with modern ELT/ETL tools within the Microsoft …
ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial. Must-Have Skills: 3+ years of Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL); 5+ years of SQL; Python; Azure; excellent client-facing communication skills; experience deploying Databricks pipelines; experience provisioning Databricks as code. Nice to Have: …
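To illustrate the Databricks/Delta Lake skills this listing names, here is a minimal sketch of a bronze-layer pipeline step followed by a Spark SQL query. It assumes a Databricks notebook (where `spark` is provided by the runtime); the landing path and table names are hypothetical:

```python
# Minimal Delta Lake sketch for a Databricks notebook; names are hypothetical.
from pyspark.sql import functions as F

raw = spark.read.json("/mnt/landing/events/")  # hypothetical landing zone

# Deduplicate and stamp each record with its ingestion time
cleaned = (
    raw.dropDuplicates(["event_id"])
    .withColumn("ingested_at", F.current_timestamp())
)

# Append to a managed Delta table in the Lakehouse
cleaned.write.format("delta").mode("append").saveAsTable("bronze.events")

# Query the table with Spark SQL
spark.sql(
    "SELECT date(ingested_at) AS day, count(*) AS n "
    "FROM bronze.events GROUP BY date(ingested_at)"
).show()
```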
Future Talent Pool - GCP Data Engineer, London, hybrid role - digital Google Cloud transformation programme. Proficiency in programming languages such as Python, PySpark, and Java to develop ETL processes for data ingestion and preparation; Spark SQL; Cloud Run, Dataflow, Cloud Storage; GCP BigQuery; Google Cloud Platform; Data Studio; Unix/Linux platform; version control tools …
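As a sketch of the GCP ingestion-and-preparation work described here, the following PySpark example reads raw files from Cloud Storage and loads them into BigQuery. It assumes the spark-bigquery connector is available on the cluster; the bucket, dataset, and column names are placeholders:

```python
# Sketch of a PySpark ETL step on GCP; assumes the spark-bigquery connector
# is on the cluster. Bucket and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcp-etl").getOrCreate()

# Ingest raw CSVs from Cloud Storage
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/")

# Prepare: type the columns and drop incomplete rows
prepared = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .dropna(subset=["customer_id", "amount"])
)

# Load into BigQuery (the connector stages writes via a temporary GCS bucket)
(
    prepared.write.format("bigquery")
    .option("table", "example_dataset.transactions")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)
```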
Glasgow, Renfrewshire, United Kingdom Hybrid / WFH Options
Cisco Systems, Inc
field. Experienced with cloud-based data processing platforms such as AWS and/or Databricks. You have strong software development skills with Python/PySpark, Terraform, Git, CI/CD, Docker. Comfortable with relational and NoSQL databases/datastores such as Elasticsearch. Familiar with the threat landscape and threat …
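For context on combining PySpark with a datastore like Elasticsearch, here is a minimal sketch of reading an index into a DataFrame and aggregating it. It assumes the elasticsearch-spark (elasticsearch-hadoop) connector is on the classpath; the host, index, and field names are hypothetical:

```python
# Illustrative sketch; assumes the elasticsearch-spark connector is available.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("threat-data-analysis").getOrCreate()

# Read security events from a hypothetical Elasticsearch index
events = (
    spark.read.format("org.elasticsearch.spark.sql")
    .option("es.nodes", "elasticsearch.internal:9200")  # placeholder host
    .load("security-events")  # hypothetical index name
)

# Simple aggregation: event counts per source IP, most active first
counts = events.groupBy("source_ip").count().orderBy("count", ascending=False)
counts.show()
```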
of data for critical client projects. How to develop and enhance your knowledge of agile ways of working and of working in an open-source stack (PySpark/PySQL). Quality engineering professionals utilise Accenture delivery assets to plan and implement quality initiatives to ensure solution quality throughout delivery. As a … team members to provide regular progress updates and raise any risks/concerns/issues. Core skills we're working with include: Palantir, Python, PySpark/PySQL, AWS or GCP. What's in it for you: At Accenture, in addition to a competitive basic salary, you will also have …