Airbyte for data ingestion, Prefect for pipeline orchestration, AWS Glue for managed ETL, along with Pandas and PySpark for pipeline logic implementation. We utilize Delta Lake and PostgreSQL for data storage, emphasizing the importance of data integrity and version control in our workflows. Day-to-day, you will …
Birmingham, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
team and wider organisation using the tech you think is required! Skills desired/what you will learn: Microsoft Azure, Azure SQL, Microsoft Fabric, Delta Lake, Databricks and Spark, Statistical Modelling, Azure ML Studio, Python and familiarity with libraries and frameworks for data analysis and machine learning (e.g. …)
designing and developing reusable data pipelines tailored for my client's data platform. Exposure to cutting-edge technologies such as serverless solutions, microservice architecture, Delta Lake, and cloud-based applications will be part of your journey, along with maintaining Infrastructure as Code (IaC). You'll need …
alternatively, give me a call on (phone number removed). Keywords: Azure Data Factory, Azure Databricks, Databricks Lakehouse, MS Power BI, Power BI, Spark, Delta Lake, T-SQL, DevOps, ETL, Data Modelling, DAX, Data Warehousing, London …
Developer, Dynamics 365, D365, D365 F&O, D365 Consultant, AX 2012, Dynamics AX, Microsoft Power BI, SSRS, ETL, Reporting Developer, SQL, Azure Synapse, Data Lake, Lakehouse, Delta Lake, Dataverse, DAX, BI reports, Paginated Report Builder, South London, Hybrid, £60-£75K. Our end user client … enterprise data warehouse/BI projects. 4+ years' experience working with Data Analysis Expressions (DAX). Extensive experience with SQL. Experience with Dataverse, Data Lake, Lakehouse and Delta Lake. Responsibilities: Dynamics-specific development, maintenance, and support: developing, maintaining, and providing support for custom analytics …
Azure Portal to support data storage, processing, and analytics. Azure Synapse Serverless: leveraging Azure Synapse Serverless for scalable and cost-effective data analytics. Data Lake, Lakehouse and Delta Lake experience: familiarity with Data Lake and Delta Lake technologies for storing and processing …/BI projects. A relevant number of years' experience working with Data Analysis Expressions (DAX). Extensive experience with SQL. Experience with Dataverse and Data Lake. Experience with the following tools: Microsoft Power BI, Synapse, Paginated Report Builder, Power Platform. KBR Company Information: when you become part of the KBR team …
with Data Warehousing, Business Intelligence, Data Science and Data Engineering needs to fulfil data & analytics solutions. Architecture experience in the cloud (Azure Storage, Databricks, Delta Lake, Lakehouse, Azure AI, MLOps). It would be great if you also have an appreciation of Data Security and governance, and exposure to working …
Proficiency in designing and implementing data models, including star schemas and snowflake schemas. Expertise in SQL. · Data Storage Technologies - Experience in SQL, NoSQL, Blob, Delta Lake, and/or other enterprise scale data stores. · Data Processing Frameworks - Proficiency in designing and building data pipelines for data processing and …
Birmingham, England, United Kingdom Hybrid / WFH Options
Lorien
data platform. Design, develop, test, and deploy data pipelines and integrations using iPaaS technology. Collaborate on engineering platform ingestion, orchestration, data warehouse/data lake and API strategies for the data management ecosystem. Willingness and an enthusiastic attitude to work within existing processes/methodologies. Collaborate with the DevOps … data processing - Expertise in processing enterprise-scale volumes of data, ideally with proficiency in Snowflake. Experience with Data Stores - Technical excellence in SQL, NoSQL, Blob, Delta Lake, and other enterprise scale data stores. Data Orchestration - Enterprise scale usage of technology such as Azure Data Factory, Apache Airflow, Logic Apps …
the project. There is a massive emphasis on PySpark and Databricks for this particular role. Technical Skills Required: Azure (ADF, Functions, Blob Storage, Data Lake Storage, Azure Databricks), Databricks, Spark, Delta Lake, SQL, Python, PySpark, ADLS. Day-To-Day Responsibilities: Extensive experience in designing, developing, and …
and constructing robust data pipelines using the best of the open-source data engineering and scientific Python toolset. Tech Stack: Airbyte, AWS Glue, Pandas, PySpark, Delta Lake, PostgreSQL. The team follows agile ways of working and you will engage with various stakeholders across the business. The role is hybrid …
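The Pandas side of a stack like the one above typically boils down to small, testable transform steps between ingestion and storage. The following is a rough sketch of one such step; the column names and the clean-then-aggregate logic are invented for the example, not taken from any listing.

```python
import pandas as pd

# Hypothetical ingestion output: raw rows as they might arrive from an
# ingestion tool (column names are illustrative only).
raw = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 2],
    "amount":  [10.0, 5.0, 7.5, None, 2.5],
})

# A typical pipeline transform: drop incomplete records, then aggregate
# per user before loading the result into the warehouse.
clean = raw.dropna(subset=["amount"])
summary = (
    clean.groupby("user_id", as_index=False)["amount"]
         .sum()
         .rename(columns={"amount": "total_amount"})
)
print(summary)  # user 1 totals 15.0, user 2 totals 10.0
```

In a PySpark variant of the same pipeline the shape is similar (`dropna`, `groupBy().sum()`), with the output written to Delta Lake tables instead of an in-memory frame.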