London, South East England, United Kingdom Hybrid / WFH Options
Solirius Consulting
Flask, Tornado or Django, Docker Experience working with ETL pipelines is desirable, e.g. Luigi, Airflow or Argo Experience with big data technologies, such as Apache Spark, Hadoop, Kafka, etc. Data acquisition and development of data sets and improving data quality Preparing data for predictive and prescriptive modelling Hands more »
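For illustration only (not part of any advert above): a minimal sketch of the ETL orchestration work these roles reference, written as an Airflow DAG. The DAG id, task names and callables are hypothetical placeholders; Luigi and Argo Workflows express the same extract → transform → load dependency chain with their own task abstractions.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # pull raw records from a source system (placeholder)
    pass


def transform():
    # clean and reshape the extracted data (placeholder)
    pass


def load():
    # write the prepared data to its target store (placeholder)
    pass


with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task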
SageMaker, or Azure Machine Learning for model development and deployment. Data Analytics and Big Data Technologies: Proficient in big data technologies such as Hadoop, Spark, and Kafka for handling large datasets. Experience with data visualization tools like Tableau, Power BI, or Qlik for deriving actionable insights from data. Programming more »
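As a hedged sketch of the Spark work implied by the requirements above (the dataset path, column names and the aggregation itself are assumptions, not taken from the advert):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("large-dataset-example").getOrCreate()

# placeholder path; any large columnar dataset would do
events = spark.read.parquet("s3://example-bucket/events/")

# typical "handle large datasets" task: a distributed daily aggregation
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("events"),
         F.countDistinct("user_id").alias("users"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")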
of Experience in SQL, Java or Python programming Experience in Data Pipeline and integration workflow management tools: Talend, Stored Procedures, Change Data Capture (CDC), Spark & Azure APIs Experience identifying operational issues and recommending fixes to resolve problems Knowledge in Azure cloud technologies like Data Flow, Databricks, Azure Synapse more »
Synapse, Azure Analysis Services and Power BI Analytics Experience in Data Pipeline and integration workflow management tools: Talend, Stored Procedures, Change Data Capture (CDC), Spark & Azure APIs more »
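For the Change Data Capture requirement, a hedged sketch of a simple high-watermark incremental load in PySpark. Table, column and connection details are invented for illustration, it assumes the target table already holds at least one load, and a production pipeline would usually merge/upsert rather than append.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdc-incremental").getOrCreate()

# highest change timestamp already present in the target (assumed column name)
last_loaded = spark.table("target_db.orders").agg(F.max("updated_at")).first()[0]

# read only rows changed since the last load; JDBC details are placeholders
changes = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
    .filter(F.col("updated_at") > F.lit(last_loaded))
)

# land the delta in a staging table for a later merge step
changes.write.mode("append").saveAsTable("target_db.orders_staging")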
at scale utilising the best of breed Cloud services and technologies. So, what tools and technologies will you be using? AWS, Python, Databricks/Spark, Trino, Airflow, Docker, CloudFormation/Terraform, SQL/NoSQL. We provide you with the opportunity to think freely and work creatively and right now … Other skills we are looking for you to demonstrate include: Experience of data storage technologies: Delta Lake, Iceberg, Hudi Sound knowledge and understanding of Apache Spark, Databricks or Hadoop Ability to take business requirements and translate these into tech specifications Knowledge of Architecture best practices and patterns Competence more »
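Since Delta Lake appears in that list, a minimal sketch of writing and then time-travel-reading a Delta table from PySpark; the paths are placeholders and it assumes the delta-spark package (or a Databricks runtime) is configured on the session.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-example").getOrCreate()

raw = spark.read.json("/mnt/landing/clicks/")            # placeholder landing path

# write as a Delta table so later jobs get ACID upserts and time travel
raw.write.format("delta").mode("overwrite").save("/mnt/bronze/clicks")

# read it back at an earlier version, e.g. for debugging or backfills
clicks_v0 = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/mnt/bronze/clicks")
)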
Edinburgh, Central Scotland, United Kingdom Hybrid / WFH Options
Change Digital – Digital & Tech Recruitment
AWS Redshift, and Python Experience with ETL processes, data integration, and data warehousing. Strong SQL skills Experience with Big Data technologies such as Hadoop, Spark, and Kafka Familiarity with cloud platforms (AWS, Azure, Google Cloud) Working knowledge of data visualisation tools (Power BI, Tableau, Qlik Sense) Additional skills: Client-facing more »
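Where Kafka plus Spark is asked for, the usual pattern is Structured Streaming. A hedged sketch follows, assuming the spark-sql-kafka connector is on the classpath; the broker address, topic and sink paths are invented for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "clickstream")                  # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; cast the value for downstream parsing
parsed = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "/data/clickstream/")                # placeholder sink
    .option("checkpointLocation", "/chk/clickstream/")
    .start()
)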
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Anson McCade
and product development, encompassing experience in both stream and batch processing. Designing and deploying production data pipelines, utilizing languages such as Java, Python, Scala, Spark, and SQL. In addition, you should have proficiency or familiarity with: Scripting and data extraction via APIs, along with composing SQL queries. Integrating data more »
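For the "scripting and data extraction via APIs, along with composing SQL queries" point, a small hedged sketch; the endpoint, field names and the SQLite staging store are stand-ins for whatever source API and warehouse a real project uses.

import requests
import sqlite3

# placeholder endpoint; most REST extractions follow this fetch-and-land shape
resp = requests.get("https://api.example.com/v1/orders", params={"page": 1}, timeout=30)
resp.raise_for_status()
orders = resp.json()["results"]          # assumed response structure

conn = sqlite3.connect("staging.db")     # stand-in for a real warehouse connection
conn.execute(
    "CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT OR REPLACE INTO orders (id, amount, status) VALUES (?, ?, ?)",
    [(o["id"], o["amount"], o["status"]) for o in orders],
)
conn.commit()

# a typical composed query over the landed data
for row in conn.execute("SELECT status, COUNT(*) FROM orders GROUP BY status"):
    print(row)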
platforms like Google AI Platform, AWS SageMaker, or Azure Machine Learning for model development and deployment. Proficient in big data technologies such as Hadoop, Spark, and Kafka for handling large datasets. Experience with data visualization tools like Tableau, Power BI, or Qlik for deriving actionable insights from data. Strong more »
Databricks, Azure SQL (Indicative experience = 5yrs+) Build and test processes supporting data extraction, data transformation, data structures, metadata, dependency and workload management. Knowledge of Spark architecture and modern Data Warehouse/Data Lake/Lakehouse techniques Build transformation tables using SQL. Moderate-level knowledge of Python/PySpark or equivalent more »
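"Build transformation tables using SQL" typically means CTAS-style jobs. A hedged sketch driven from PySpark, with the source path, table and column names all assumed rather than taken from the advert:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-transform").getOrCreate()

# expose raw files to SQL under an illustrative view name
spark.read.parquet("/data/raw/sales/").createOrReplaceTempView("raw_sales")

# build a transformation table from the raw view (names are placeholders)
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics_daily_sales AS
    SELECT
        CAST(order_ts AS DATE)   AS order_date,
        store_id,
        SUM(net_amount)          AS revenue,
        COUNT(DISTINCT order_id) AS orders
    FROM raw_sales
    GROUP BY CAST(order_ts AS DATE), store_id
""")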
and strong knowledge of best practices and solution patterns Hands-on experience in Data Pipeline and integration tools like Talend, Change Data Capture (CDC), Spark & Azure APIs Coach and guide technical engineers on source data understanding, designing analytical solutions and critical operational support more »
pipelines Know your way around Unix-based operating systems Experience working with any major cloud provider (AWS, GCP, Azure) Fluency in English Experience using Apache Airflow Experience using Docker Experience using Apache Spark Benefits: Salary £40-50K per annum dependent on skills and experience 25 Days more »
approach
• Integration with 3rd party sources, open-source APIs
• Data movement in both batch and real-time change data capture
• Leverage Big Data technologies (Spark) to ingest, clean & transform data
• Apply Machine Learning on the DWH to proactively identify issues and generate insights
• Create dashboards in Oracle Analytics Cloud more »
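"Apply Machine Learning on the DWH to proactively identify issues" could take many forms; one hedged sketch is an Isolation Forest over daily load metrics, with the input file and metric columns invented purely for illustration.

import pandas as pd
from sklearn.ensemble import IsolationForest

# stand-in for an extract of per-day pipeline metrics pulled from the warehouse
df = pd.read_csv("daily_metrics.csv", parse_dates=["day"])

features = df[["row_count", "null_rate", "load_minutes"]]    # illustrative metrics
model = IsolationForest(contamination=0.05, random_state=0)
df["anomaly"] = model.fit_predict(features)                   # -1 marks outliers

# surface the suspicious days for review or alerting
print(df.loc[df["anomaly"] == -1, ["day", "row_count", "null_rate", "load_minutes"]])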