Grand Prairie, Texas, United States Hybrid / WFH Options
Jobot
data warehouse solutions (e.g., Snowflake, Fabric). Expertise in Python for data engineering tasks, including data manipulation (Pandas, NumPy) and workflow management (Dask, PySpark, FastAPI). Solid knowledge of cloud platforms (Azure, AWS) and big data technologies (Hadoop, Spark). Hands-on experience with Docker, Kubernetes …
years of experience in Data Engineering, with a focus on cloud platforms (AWS, Azure, GCP); You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog); You have extensive experience in ETL/ELT development and data pipeline orchestration (e.g., Databricks Workflows, DLT, Airflow, ADF …
Wakefield, Yorkshire, United Kingdom Hybrid / WFH Options
Flippa.com
/CD) automation, rigorous code reviews, documentation as communication. Preferred Qualifications Familiarity with data manipulation and experience with Python libraries like Flask, FastAPI, Pandas, PySpark, PyTorch, to name a few. Proficiency in statistics and/or machine learning libraries like NumPy, matplotlib, seaborn, scikit-learn, etc. Experience in building …
storage, data pipelines to ingest and transform data, and querying & reporting of analytical data. You've worked with technologies such as Python, Spark, SQL, PySpark, Power BI etc. You're a problem-solver, pragmatically exploring options and finding effective solutions. An understanding of how to design and build well-structured …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
various departments to gather requirements and ensure data solutions reflect real business needs. Key Experience Required: Deep expertise in SQL, Python, and Spark (particularly PySpark) for building and testing end-to-end pipelines that process both structured and semi-structured datasets. Experience mentoring peers and supporting team growth …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
experience as a Senior Data Engineer, with some experience mentoring others Excellent Python and SQL skills, with hands-on experience building pipelines in Spark (PySpark preferred) Experience with cloud platforms (AWS/Azure) Solid understanding of data architecture, modelling, and ETL/ELT pipelines Experience using tools like Databricks …
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Knowledge, Skills and Experience: Essential Strong expertise in Databricks and Apache Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data …
Herndon, Virginia, United States Hybrid / WFH Options
The DarkStar Group
scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA …
Chantilly, Virginia, United States Hybrid / WFH Options
The DarkStar Group
scikit-learn, standard libraries, etc.), Python packages that wrap Machine Learning (packages for NLP, Object Detection, etc.), Linux, AWS/C2S, Apache NiFi, Spark, PySpark, Hadoop, Kafka, Elasticsearch, Solr, Kibana, Neo4j, MariaDB, Postgres, Docker, Puppet, and many others. Work on this program takes place in Chantilly, VA, McLean, VA …
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom Hybrid / WFH Options
Client Server
AAB grades at A-level You have commercial Data Engineering experience working with technologies such as SQL, Apache Spark and Python including PySpark and Pandas You have a good understanding of modern data engineering best practices Ideally you will also have experience with Azure and Databricks …
Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
IAT Level II Certification may be required (GSEC, GICSP, CND, CySA+, Security+ CE, SSCP or CCNA-Security). Advanced proficiency in Python, SQL, PySpark, MLOps, computer vision, and NLP, Databricks, AWS, Data Connections Cluster Training and tools such as GitLab, with proven success in architecting complex AI/ML …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Peaple Talent
Strong experience designing and delivering data solutions in Databricks Proficient with SQL and Python Experience using Big Data technologies such as Apache Spark or PySpark Great communication skills, effectively participating with Senior Stakeholders Nice to haves: Azure/AWS Data Engineering certifications Databricks certifications What's in it for …
London, South East England, United Kingdom Hybrid / WFH Options
Peaple Talent
Strong experience designing and delivering data solutions in Databricks Proficient with SQL and Python Experience using Big Data technologies such as Apache Spark or PySpark Great communication skills, effectively participating with Senior Stakeholders Nice to haves: Azure/AWS Data Engineering certifications Databricks certifications What's in it for …
the work you deliver Furthermore, you have experience in: working with AWS developing applications in a Kubernetes environment developing batch jobs in Apache Spark (PySpark or Scala) and scheduling them in an Airflow environment developing streaming applications for Apache Kafka in Python or Scala working with CI/CD …
/Experience: The ideal candidate will have the following: Proven experience leading data engineering projects, especially cloud migration initiatives. Hands-on experience with Python, PySpark, SQL, and Scala. Deep knowledge of modern data architecture, data modeling, and ELT/ETL practices. Strong grasp of data security, governance, and compliance …
a bachelor's or university degree or equivalent by experience. A postgraduate degree is an asset; Strong programming skills, particularly in languages such as Python (PySpark, Pandas) and SQL; You have knowledge of big data principles (such as distributed processing); You have an understanding of data management principles, data architecture design …
City of London, London, United Kingdom Hybrid / WFH Options
Ikhoi Recruitment
the Canary Wharf office, 3 days WFH, pension, life assurance, season ticket loan and lots more. You must have experience with Python/PySpark, Azure and managing a small team of Mid-level/Senior Data Engineers. As part of the Technology Hub within the client, Data Engineer …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
cloud-based architectures on AWS, Azure, or GCP A strong background in Databricks for building enterprise data warehouses with strong knowledge of SQL and PySpark A deep understanding of data modelling and advanced pipeline development Knowledge and understanding of best-practice CI/CD approaches utilising Git. Strategic Vision …
Tampa, Florida, United States Hybrid / WFH Options
LTIMindtree
Python unit-test libraries. Good to have GenAI skill set. Good knowledge and work experience in Unix commands, shell scripts, etc. Good to have experience in PySpark, Hadoop and Hive as well. Expertise in software engineering principles such as design patterns, code design, testing and documentation. Writing effective and scalable code …
Houston, Texas, United States Hybrid / WFH Options
Jobot
AWS cloud services to expand and support data infrastructure. 8. Leverage Databricks for big data processing and machine learning tasks. 9. Use Python and PySpark for data analysis and data wrangling. 10. Collaborate with stakeholders to understand and meet their data requirements, while also ensuring data privacy and compliance. …
Platform technologies (Synapse, Data Lakes, ADF) Expertise in data modelling, ETL/ELT pipeline development, and data integration Proficient in SQL and Python (ideally PySpark) Knowledge of tools such as Power BI, Microsoft Fabric, and DevOps (CI/CD pipelines) Experience working with enterprise data sources and APIs (e.g. …
transformation logic, and build of all Power BI dashboards - including testing, optimization & integration to data sources Good exposure to all elements of data engineering (PySpark, Lakehouses, Kafka, etc.) Experience building reports from streaming data Strong understanding of CI/CD pipelines Financial and/or trading exposure, particularly energy …