
Senior Data Engineer

Senior Data Engineer
Permanent role
Location – Reading
Hybrid role

Responsibilities
· Engineer cloud-based data lake patterns to connect and leverage organisational and external data
· Capture, ingest, process, and store large volumes of data, and make it available for multiple data science use cases (see the sketch after this list)
· Enable higher-quality data to be created and managed in our customers' organisations, and standardise the definition of business-critical data
· Develop data products that are commercially viable and technically excellent, utilising (primarily) cloud-native patterns available in the market
· Help to diagnose and fix issues in complex technical environments, and provide technical support and guidance through the prototyping, testing, build, and launch of data products
· Help to understand customer requirements and work collaboratively with other architects and consultants to design data solutions that deliver the expected outcomes
· Contribute to estimating activities, driving towards approaches that balance quality and velocity of delivery
· Contribute to comprehensive technical documentation that supports the deployment and administration of the solutions we build
· Contribute to continually refining our development standards and best practices, promoting their reuse and ensuring they are adopted
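
A minimal sketch of the ingest/process/store pattern the responsibilities describe, assuming a Databricks/PySpark environment; the storage paths, container names, and the event_id column are placeholders for this example, not details taken from the role:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

# Hypothetical lake zones; real paths would point at ADLS Gen2 containers.
raw_path = "abfss://raw@example.dfs.core.windows.net/events/"
curated_path = "abfss://curated@example.dfs.core.windows.net/events/"

# Capture/ingest: read newly landed JSON files from the raw zone.
raw_df = spark.read.format("json").load(raw_path)

# Process: light standardisation so every data science use case shares one definition.
curated_df = (
    raw_df
    .withColumn("ingested_at", F.current_timestamp())
    .filter(F.col("event_id").isNotNull())   # event_id is an assumed key column
    .dropDuplicates(["event_id"])
)

# Store: write to the curated zone in Delta format for downstream consumers.
curated_df.write.format("delta").mode("append").save(curated_path)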

About you
· Strong technical expertise using core Microsoft Azure data technologies: Data Factory, Databricks, Data Lake, Azure SQL, Synapse, Data Catalog, Purview
· A strong background in Python development (particularly for data engineering) and SQL
· Experience working with Power BI: Power Query, DAX, Power BI Service
· Experience of teamwork and task tracking with Azure DevOps, following Scrum/Agile methodologies
· Experience with Azure databases (Synapse EDW, SQL Managed Instance, Azure SQL)
· Excellent communication and interpersonal skills
· Experience with TDD, Continuous Integration, Continuous Deployment, Peer Review, and Automated Testing for DataOps solution delivery (a test-first sketch follows this list)
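
A small illustration of the test-first style the last bullet asks for, applied to a data transformation; this is a generic sketch using pandas and pytest, and the standardise_emails function and its email column are invented for the example:

import pandas as pd

def standardise_emails(df: pd.DataFrame) -> pd.DataFrame:
    """Trim and lower-case the email column; drop rows without an email."""
    out = df.copy()
    out["email"] = out["email"].str.strip().str.lower()
    return out.dropna(subset=["email"])

# Written first under TDD; run with `pytest`.
def test_standardise_emails_normalises_and_drops_blanks():
    df = pd.DataFrame({"email": ["  Alice@Example.COM ", None]})
    result = standardise_emails(df)
    assert list(result["email"]) == ["alice@example.com"]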

Knowledge and experience of the following would be advantageous:

· C# development skills
· Experience of wider Azure services used for infrastructure build and monitoring (ARM, Policy, Monitor, Log Analytics, etc.)
· Previous experience of delivering DW & BI using the Microsoft BI stack (SSIS, SSRS, SSAS)
· Experience of implementing big data/analytics services
· Experience working in an environment where operational support and monitoring of code and systems is part of the culture (DevOps)
· Experience supporting AI/machine learning workflows at production scale
· Experience of working with NoSQL databases such as Cosmos DB or MongoDB (see the sketch after this list)
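
For the NoSQL point, a minimal sketch of querying Cosmos DB from Python, assuming the azure-cosmos (v4) SDK; the endpoint, key, database, container, and field names are placeholders:

from azure.cosmos import CosmosClient

# Placeholder endpoint and key; a real deployment would use Key Vault or managed identity.
client = CosmosClient("https://example.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("telemetry").get_container_client("events")

# Parameterised cross-partition query over recent events.
items = container.query_items(
    query="SELECT c.id, c.ts FROM c WHERE c.ts > @cutoff",
    parameters=[{"name": "@cutoff", "value": "2024-01-01T00:00:00Z"}],
    enable_cross_partition_query=True,
)
for item in items:
    print(item["id"], item["ts"])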

Senior Data Engineer

Role – Senior Data Engineer
Location – Aarhus, Denmark
Type of hiring – Contract (12 months with possible extension)

Job Requirements:
· Data infrastructure competencies to deploy and monitor the resources required by the domains
· Experience in designing scalable ETL/ELT processes based on data lake and data warehouse architectures
· Experience in T-SQL is required; Python is a plus (see the sketch after this list)
· Experience with cloud services such as GCP, Azure, or AWS
· Experience with Power BI and other self-service solutions is a plus
· Experience with cloud data-warehouses such as Snowflake, BigQuery, or Synapse is a plus
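
A rough sketch of an in-database ELT step driven from Python, combining the T-SQL and Python points above; the connection string, schema, and table names are hypothetical:

import pyodbc

# Placeholder connection string; real credentials would come from configuration or a vault.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=example;DATABASE=dw;"
    "UID=etl_user;PWD=<secret>"
)

# ELT: data is already loaded into a staging table; transform in-database with T-SQL.
merge_sql = """
MERGE dw.dim_customer AS tgt
USING stg.customer AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET tgt.name = src.name, tgt.updated_at = SYSUTCDATETIME()
WHEN NOT MATCHED THEN
    INSERT (customer_id, name, updated_at)
    VALUES (src.customer_id, src.name, SYSUTCDATETIME());
"""

with conn:
    conn.execute(merge_sql)   # pyodbc commits on clean exit from the with-block
conn.close()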
