Falls Church, Virginia, United States Hybrid / WFH Options
Epsilon Inc
abilities, and the capacity to work productively in cross-functional teams while maintaining a continuous improvement mindset. One or more of the following certifications are desired: AWS Certified Developer, Databricks, Agile/Scrum, Python Programmer. Preferred Qualifications: Familiarity with Apache Spark or comparable distributed data processing frameworks, preferably for large-scale data transformations and analytics. Working knowledge of data governance … platforms (e.g., Collibra) and cloud-based analytics solutions (e.g., Databricks). Hands-on experience with modern BI tools (e.g., Qlik, Power BI, Tableau) for data visualization and insight generation. Awareness of data access control best practices (e.g., Immuta) and familiarity with role- or attribute-based access control frameworks. Other: Must hold an Active DOD Secret, Top Secret or TS/ More ❯
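The listing above asks for familiarity with role- or attribute-based access control frameworks. As a rough illustration of what an attribute-based check means in code, here is a minimal Python sketch; the clearance levels, attribute names, and policy are hypothetical and are not taken from the posting or from any specific product such as Immuta.

```python
# Minimal attribute-based access control (ABAC) sketch.
# All attribute names and the clearance ordering below are hypothetical.
from dataclasses import dataclass

@dataclass
class User:
    name: str
    clearance: str   # e.g. "secret", "top_secret"

@dataclass
class Dataset:
    name: str
    classification: str  # e.g. "secret"

# Ordered from least to most restrictive (illustrative only).
CLEARANCE_ORDER = ["public", "confidential", "secret", "top_secret"]

def can_read(user: User, dataset: Dataset) -> bool:
    """Grant read access only if the user's clearance dominates the dataset's classification."""
    return CLEARANCE_ORDER.index(user.clearance) >= CLEARANCE_ORDER.index(dataset.classification)

if __name__ == "__main__":
    analyst = User("a.smith", "secret")
    print(can_read(analyst, Dataset("mission_reports", "secret")))   # True
    print(can_read(analyst, Dataset("sigint_feed", "top_secret")))   # False
```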
Strong analytical skills and attention to detail. Bonus Skills: Experience with Microsoft Fabric (OneLake, Direct Lake, Data Warehouse, Data Activator) is a significant plus. Knowledge of Azure Synapse Analytics, Databricks, or Power BI. Experience with automated data validation or scripting (e.g., Python, PowerShell). Familiarity with CI/CD processes in data environments. Relevant certifications (e.g., Microsoft Certified: Azure More ❯
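The posting above lists automated data validation and scripting (e.g., Python) as bonus skills. The sketch below shows what such a check might look like; the DataFrame columns and validation rules are hypothetical, and pandas is only one of several tools that could be used for this kind of task.

```python
# Illustrative data-validation sketch; column names and rules are hypothetical.
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable validation failures (empty list means the data passed)."""
    errors = []
    if df["order_id"].duplicated().any():
        errors.append("duplicate order_id values found")
    if df["amount"].lt(0).any():
        errors.append("negative amounts found")
    if df["order_date"].isna().any():
        errors.append("missing order_date values")
    return errors

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 2],
        "amount": [10.0, -5.0, 7.5],
        "order_date": pd.to_datetime(["2024-01-01", None, "2024-01-03"]),
    })
    for problem in validate_orders(sample):
        print("FAILED:", problem)
```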
for Azure services. Skills, Experience & Competencies Required Skills: Extensive experience in data engineering with Microsoft Azure. Proficiency in Azure services such as Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database. Strong experience with ETL/ELT pipelines, data modelling, and data integration. Proficiency in SQL and programming languages like Python, Scala, or PowerShell. More ❯
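The role above centres on ETL/ELT pipelines built with Azure Data Factory, Azure Databricks, and Python. Below is a rough sketch of the transform step of such a pipeline in PySpark; the storage account, container, and column names are hypothetical, and a real Databricks job would add configuration, error handling, and orchestration around this core.

```python
# Illustrative PySpark ETL sketch, as it might run on Azure Databricks.
# Paths, the storage account name, and column names are hypothetical.
# The Delta write assumes the Delta Lake libraries bundled with Databricks.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_etl").getOrCreate()

# Extract: read raw CSV files landed in the data lake (hypothetical path).
raw = spark.read.option("header", True).csv("abfss://raw@examplelake.dfs.core.windows.net/sales/")

# Transform: type the columns, drop bad rows, and aggregate to a daily summary.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("sale_date", F.to_date("sale_date"))
       .filter(F.col("amount").isNotNull())
)
daily = clean.groupBy("sale_date").agg(F.sum("amount").alias("total_amount"))

# Load: write the curated result as a Delta table (hypothetical target path).
daily.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/sales_daily/"
)
```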
practices. Collaborating with leadership and stakeholders to align data priorities. Qualifications and Experience: Expertise in Commercial/Procurement Analytics and SAP (S/4HANA). Experience with Spark, Databricks, or similar tools. Strong proficiency in data modeling, SQL, NoSQL, and data warehousing. Hands-on experience with data pipelines, ETL, and big data technologies. Proficiency in cloud platforms like AWS More ❯
using both on-premises and cloud-based technologies (Azure, AWS, or GCP). You'll design scalable data architectures, including data lakes, lakehouses, and warehouses, leveraging tools such as Databricks, Snowflake, and Azure Synapse. The ideal candidate will have a deep technical background in data engineering and a passion for leading the development of best-in-class data solutions. You More ❯
/ELT, Data Lakes, Warehousing, MDM, and BI. Engineering delivery practices : Knowledge of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipeline orchestration. Broader platform experience (desirable): Familiarity with Databricks, Snowflake, Azure Data Factory, Azure Synapse, Microsoft SQL/SSIS. Certifications that are a plus: Microsoft Certified: Fabric Analytics Engineer Associate (DP-600) Microsoft Certified: Fabric Data Engineer Associate (DP More ❯
Watford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Certification: DP-600 – Fabric Analytics Engineer Associate. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation. Writing clean, efficient Python code for data engineering. Collaborating with data scientists and stakeholders on machine learning and More ❯
SQL skills, including data transformation and profiling. Experience building scalable production ETL/ELT pipelines. 1-2 years of hands-on experience with Azure services such as Data Factory, Databricks, Synapse (DWH), Azure Functions, and other data analytics tools, including streaming. Experience with Airflow and Kubernetes. Programming skills in Python (PySpark) and scripting languages like Bash. Knowledge of Git, CI More ❯
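The listing above mentions Airflow alongside the Azure data services. As an illustration of the orchestration side of such a role, here is a minimal Airflow DAG sketch (assuming Airflow 2.4 or later); the DAG id, schedule, and task bodies are hypothetical placeholders rather than anything taken from the posting.

```python
# Minimal Airflow DAG sketch: ingest step followed by a transform step.
# Assumes Airflow 2.4+ (which accepts the `schedule` argument); all names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pulling raw files into the landing zone")

def transform():
    print("running the PySpark transformation job")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run the transform only after ingestion has completed.
    ingest_task >> transform_task
```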
Bournemouth, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient Python code for data engineering tasks. Collaborating with data scientists More ❯
Advanced SQL, data transformation and data profiling skills. Experience of building production ETL/ELT pipelines at scale. 1-2 years of hands-on experience with Azure: Data Factory, Databricks, Synapse (DWH), Azure Functions, Logic Apps and other data analytics services, including streaming. Experience with Airflow and Kubernetes. Programming languages: Python (PySpark), scripting languages like Bash. Knowledge of Git, CI More ❯
Reading, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
drive technical innovation across client projects. Document processes and contribute to internal knowledge repositories and best practice libraries. Key Skills & Experience: Strong hands-on experience with Azure tooling, including Databricks, Data Factory, Data Lake, and Synapse (or similar data warehouse tools); Azure Analysis Services or comparable BI tooling. Solid programming capability in SQL, Python, Spark, and ideally DAX. Familiarity with More ❯
Cheltenham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Hemel Hempstead, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with More ❯
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily role will include, but is not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation to ensure data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Derby, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with More ❯
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineer Associate certification. Responsibilities Daily tasks include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation to ensure data availability and quality. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Cambridge, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
a daily basis, your role will include, but is not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Stoke-on-Trent, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Coventry, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Chelmsford, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Chester, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
Bath, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities: Your daily activities will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯
York, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
certification. Responsibilities Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating More ❯