EC2, EMR, RDS, Redshift (or Azure equivalents). Data streaming systems: Storm, Spark Streaming, etc. Search tools: Solr, Lucene, Elasticsearch. Object-oriented/object-functional scripting languages: Python, Java, C++, Scala, etc. Advanced working SQL knowledge and experience working with relational databases, query authoring and optimization, as well as working familiarity with a variety of databases. Experience with message queuing …
London (City of London), South East England, United Kingdom
Mastek
data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines. Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively …
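A minimal sketch of the kind of PySpark transformation and data quality gate this role describes. Table names, column names, and the 1% failure threshold are illustrative assumptions, not taken from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_enrichment").getOrCreate()

# Hypothetical source tables; replace with the real Databricks/Delta locations.
orders = spark.read.table("raw.orders")
customers = spark.read.table("raw.customers")

# Clean, enrich, and aggregate: trim keys, join to customers, roll up daily revenue.
enriched = (
    orders
    .withColumn("customer_id", F.trim(F.col("customer_id")))
    .join(customers, "customer_id", "left")
    .withColumn("order_date", F.to_date("order_ts"))
)

daily_revenue = (
    enriched.groupBy("order_date", "country")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Simple data quality check: fail the run if too many orders lost their customer match.
total = enriched.count()
unmatched = enriched.filter(F.col("country").isNull()).count()
if total > 0 and unmatched / total > 0.01:  # 1% threshold is an assumed rule
    raise ValueError(f"Data quality check failed: {unmatched}/{total} unmatched orders")

daily_revenue.write.mode("overwrite").saveAsTable("curated.daily_revenue")
```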
Washington, Washington DC, United States Hybrid / WFH Options
Titania Solutions Group
Confluent Kafka (Confluent Platform, Kafka Streams, kSQL, Schema Registry, etc.). Proficiency in Qlik Sense (data modeling, scripting, and dashboard creation). Strong programming skills in Python, Java, or Scala for data processing. Expertise in SQL and experience with relational and NoSQL databases. Familiarity with cloud platforms (e.g., AWS or Azure) and related services for data engineering. Experience with data …
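Kafka Streams and kSQL are JVM/SQL components of the Confluent Platform; as a hedged sketch of the Python side of such processing, the snippet below consumes a topic with the confluent-kafka client. Broker address, group id, and topic name are assumptions:

```python
from confluent_kafka import Consumer

# Assumed broker, group id, and topic -- not from the posting.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "qlik-feed-processor",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for a record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Downstream processing (e.g. landing rows in a table Qlik Sense reads) goes here.
        print(msg.key(), msg.value().decode("utf-8"))
finally:
    consumer.close()
```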
deployment in secure and scalable environments to include AI/ML frameworks such as TensorFlow, PyTorch, or scikit-learn. Proven expertise in programming languages such as Python, Java, or Scala, with demonstrated experience in software engineering practices (e.g., version control, CI/CD pipelines, containerization). Experience building and optimizing data pipelines, ETL processes, and real-time streaming solutions using tools …
on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience …
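To illustrate the Spark/Airflow/dbt orchestration these requirements point at, here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, script paths, and dbt project location are placeholders chosen for the example:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A minimal daily pipeline: run a Spark job, then a dbt build.
# Paths, schedule, and commands are assumed placeholders.
with DAG(
    dag_id="daily_sales_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule" requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="spark_ingest",
        bash_command="spark-submit /opt/jobs/ingest_sales.py",
    )
    transform = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/dbt/sales && dbt build",
    )
    ingest >> transform
```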
into usable formats, and support project team to scale, monitor, and operate data platforms. TS/SCI clearance. Bachelor's degree. Nice If You Have: Experience with Python, SQL, Scala, or Java. Experience with UNIX or Linux, including basic commands and Shell scripting. Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed …
and support project team to scale, monitor, and operate data platforms. TS/SCI clearance. Bachelor's degree. Nice If You Have: Experience in application development utilizing SQL or Scala. Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed data or computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or …
leading projects and deliverables within a collaborative, cross-functional team environment. TS/SCI clearance. Bachelor's degree. Nice If You Have: 10+ years of experience using Python, SQL, Scala, or Java. 5+ years of experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. 5+ years of experience with distributed data or computing tools, including Spark, Databricks …
of data into usable formats and support project team to scale, monitor, and operate data platforms. Secret clearance. Bachelor's degree. Nice If You Have: Experience with Python, SQL, Scala, or Java. Experience with UNIX or Linux, including basic commands and Shell scripting. Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed …
monitor, and operate data platforms. Secret clearance. Bachelor's degree in a Computer Science, Analytics, or Mathematics field. Nice If You Have: Experience in application development utilizing SQL or Scala. Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed data or computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or …
on experience with cloud platforms like AWS or Azure, including relevant services like S3, EMR, Glue, Data Factory, etc. Proficiency in SQL and one or more programming languages (Python, Scala, or Java) for data manipulation and transformation. Knowledge of data security and privacy best practices, including data access controls, encryption, and data masking techniques. Strong problem-solving and analytical skills …
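As a hedged sketch of the column-level data masking the posting mentions, the snippet below pseudonymizes and partially masks identifying fields in PySpark; the dataframe, column names, and masking rules are illustrative assumptions:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("masking_demo").getOrCreate()

# Hypothetical customer extract with direct identifiers.
df = spark.createDataFrame(
    [("alice@example.com", "4111111111111111", 42)],
    ["email", "card_number", "age"],
)

masked = (
    df
    # One-way hash gives a joinable pseudonymous key without exposing the email.
    .withColumn("email_hash", F.sha2(F.col("email"), 256))
    # Partial masking: keep only the last four digits of the card number.
    .withColumn(
        "card_last4",
        F.concat(F.lit("**** **** **** "), F.substring("card_number", -4, 4)),
    )
    .drop("email", "card_number")
)

masked.show(truncate=False)
```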
resolve flow issues, optimize performance, and implement error-handling strategies. • Optional scripting skills for creating custom NiFi processors. Programming & Data Technologies: • Proficiency in Java and SQL. • Experience with C# and Scala is a plus. • Experience with ETL tools and big data platforms. • Knowledge of data modeling, replication, and query optimization. • Hands-on experience with SQL and NoSQL databases is desirable. • Familiarity …
Reston, Virginia, United States Hybrid / WFH Options
ICF
for IaC. Emergency management domain knowledge a plus. Advanced proficiency in data engineering and analytics using Python. Expert-level SQL skills for data manipulation and analysis; experience with Scala preferred but not required (Python expertise can substitute). Proven experience breaking down complex ideas into manageable components. Demonstrable experience developing rapid POCs and prototypes. History of staying current with evolving …
and compliance needs. Document data flows and engineering processes for transparency and knowledge sharing. Skills & Experience Programming & Data Technologies: Proficiency in Java and SQL. Experience with C# and Scala is a plus. Familiarity with ETL tools and big data platforms. Knowledge of data modelling, replication, and query optimization. Experience with SQL and NoSQL databases. Exposure to data warehousing tools …
knowledge sharing across the team. What We're Looking For: Strong hands-on experience across AWS Glue, Lambda, Step Functions, RDS, Redshift, and Boto3. Proficient in one of Python, Scala, or Java, with strong experience in big data technologies such as Spark, Hadoop, etc. Practical knowledge of building real-time event streaming pipelines (e.g., Kafka, Spark Streaming, Kinesis). Proven …
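A minimal sketch of a real-time streaming pipeline of the kind described here, reading a Kafka topic with Spark Structured Streaming and aggregating per minute; the broker address, topic name, and console sink are assumptions, and the spark-sql-kafka package must be on the classpath:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Assumed broker and topic names.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka values arrive as bytes; cast to string before working with them.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

# Tumbling one-minute windows of event counts.
counts = parsed.groupBy(F.window("timestamp", "1 minute")).count()

query = (
    counts.writeStream.outputMode("complete")
    .format("console")   # console sink stands in for a real target (e.g. Redshift via a batch sink)
    .start()
)
query.awaitTermination()
```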
monitor, and operate data platforms. Secret clearance. Bachelor's degree. Nice If You Have: Experience in workflow development with JavaScript or Groovy. Experience in application development utilizing SQL or Scala. Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud. Experience with distributed data and computing tools such as Spark, Databricks, Hadoop, Hive, AWS EMR, or Kafka. Experience …
London, England, United Kingdom Hybrid / WFH Options
Client Server
a week with flexibility to work from home once a week. About you: You have experience as a Data Engineer, working on scalable systems. You have Python, Java or Scala coding skills. You have experience with Kafka for data streaming including Kafka Streams and KTables. You have strong SQL skills (e.g. PostgreSQL, MySQL). You have strong AWS knowledge, ideally including …