Bromley Business Centre, 27 Hastings Road, Bromley, Kent BR2 8NA, UK

Azure Data Architect

Job Title: Azure Data Architect (Solution Architect)
Hiring Location: Warsaw, Poland
Is remote work an option? No – hybrid model (3 days office, 2 days WFH)
Must-have skills: Azure / Databricks or PySpark

Responsibilities:
· Candidate must have at least 10 years' experience in the design and development of ETL solutions on the Azure stack
· Strong understanding of Azure service components such as SQL DB, SQL Server, Databricks, Blob Storage, and AAS
· Should have good experience designing analytics using Synapse, AAS, or Power BI cubes
· Should be experienced in migrating on-prem and legacy DWHs to the Azure stack
· Candidate should have proven experience in data integration and migration projects.
· Candidate must understand the development and deployment of data applications on the Azure stack, including familiarity with Angular for front-end development and Java full-stack product development.
· Candidate must be familiar with Azure network and security models and Storage accounts.
· Must have working experience with Azure DevOps and CI/CD tools at each phase of the SDLC.
· Should be experienced in developing data ingestion using Logic Apps / Azure Functions to call APIs for extracting data from SharePoint and other applications.
· Should understand application development on the Azure stack and should be able to guide front-end developers in making required changes as per project requirements.
· Should have a deep understanding of end-to-end infrastructure and services setup on the Azure stack to ensure that the platform is optimally configured.
· Experience in defining data quality rules and designing ETL frameworks to execute these DQ rules as part of the ingestion process.
· Should have experience in designing metadata-driven generic frameworks that track ETL audit information.
· Candidate needs to work as a technical lead and should be able to direct junior team members in proper design and development.
· Candidate needs to work independently with clients and, therefore, must have excellent communication skills and should be adept at project planning, execution, and status tracking.
· Should have proven experience working in an onshore-offshore model.
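The data-quality and metadata-driven framework requirements above can be sketched in plain Python. This is a minimal illustration, not any specific Azure implementation: the rule names, check types, and sample rows are hypothetical, and a real framework would read rules from a metadata store and run inside the ingestion pipeline (e.g. Databricks/PySpark).

```python
# Minimal sketch of a metadata-driven data-quality framework.
# RULES stands in for rule definitions that would normally live in a
# metadata table; the sample rows and check names are illustrative only.

RULES = [
    {"column": "customer_id", "check": "not_null"},
    {"column": "amount", "check": "non_negative"},
]

def apply_rule(row, rule):
    """Evaluate a single DQ rule against one row; True means the row passes."""
    value = row.get(rule["column"])
    if rule["check"] == "not_null":
        return value is not None
    if rule["check"] == "non_negative":
        return value is not None and value >= 0
    raise ValueError(f"unknown check: {rule['check']}")

def run_dq(rows, rules):
    """Split rows into passing and failing, recording which checks failed."""
    passed, failed = [], []
    for row in rows:
        violations = [r["check"] for r in rules if not apply_rule(row, r)]
        (failed if violations else passed).append((row, violations))
    return passed, failed

rows = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": 5.0},
    {"customer_id": 2, "amount": -3.0},
]
passed, failed = run_dq(rows, RULES)
```

Because the rules are data rather than code, new checks can be added without changing the ingestion logic, which is the point of the metadata-driven design.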

Azure Data Architect

Please find the updated JD; the position is Azure Data Architect:

Mode of hiring: FTH / SUBCON / B2B
We need immediate joiners
Pay rate: 500-600 DKK/hr
Experience: 8 to 12 years

Data on Cloud
Data warehousing
Data Lake
Experience with any Azure database, e.g. Synapse
Azure Data Factory (ADF) as the ETL tool
Power BI for reporting

Azure Data Architect

Warsaw, Poland (Remote work opportunity)
12 Month Subcontract or Permanent Position
Hourly Rate or Salary totally Negotiable
Starting ASAP

Our client is looking for an Azure Architect with the below experience:

Detailed Job Description:
· Candidate must have at least 12 years' experience in the design and development of ETL solutions on the Azure stack
· Strong understanding of Azure service components such as SQL DB, SQL Server, Databricks, Blob Storage, and AAS
· Should have good experience designing analytics using Synapse, AAS, or Power BI cubes
· Should be experienced in migrating on-prem and legacy DWHs to the Azure stack
· Candidate should have proven experience in data integration and migration projects.
· Candidate must understand the development and deployment of data applications on the Azure stack, including familiarity with Angular for front-end development and Java full-stack product development.
· Candidate must be familiar with Azure network and security models and Storage accounts.
· Must have working experience with Azure DevOps and CI/CD tools at each phase of the SDLC.
· Should be experienced in developing data ingestion using Logic Apps / Azure Functions to call APIs for extracting data from SharePoint and other applications.
· Should understand application development on the Azure stack and should be able to guide front-end developers in making required changes as per project requirements.
· Should have a deep understanding of end-to-end infrastructure and services setup on the Azure stack to ensure that the platform is optimally configured.
· Experience in defining data quality rules and designing ETL frameworks to execute these DQ rules as part of the ingestion process.
· Should have experience in designing metadata-driven generic frameworks that track ETL audit information.
· Candidate needs to work as a technical lead and should be able to direct junior team members in proper design and development.
· Should have a good understanding of LTI quality processes.
· Candidate needs to work independently with clients and, therefore, must have excellent communication skills and should be adept at project planning, execution, and status tracking.
· Should have proven experience working in an onshore-offshore model.
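The ETL-audit requirement above ("metadata-driven generic frameworks that track ETL audit information") can be sketched as a small run-tracking layer. This is a hedged illustration under assumptions: the field names, statuses, and in-memory audit log are hypothetical stand-ins for what would normally be an audit table in SQL DB written to by each pipeline run.

```python
# Sketch of generic ETL audit tracking: every pipeline run gets a start
# record, then is closed out with row counts and a final status.
import datetime

def start_audit(audit_log, pipeline, source):
    """Record the start of a pipeline run and return its audit entry."""
    entry = {
        "run_id": len(audit_log) + 1,   # a real framework would use a sequence/GUID
        "pipeline": pipeline,
        "source": source,
        "status": "RUNNING",
        "rows_read": 0,
        "started_at": datetime.datetime.now(datetime.timezone.utc),
    }
    audit_log.append(entry)
    return entry

def finish_audit(entry, rows_read, error=None):
    """Close an audit entry with its outcome and row count."""
    entry["rows_read"] = rows_read
    entry["status"] = "FAILED" if error else "SUCCEEDED"
    entry["ended_at"] = datetime.datetime.now(datetime.timezone.utc)

audit_log = []
run = start_audit(audit_log, "ingest_sales", "sharepoint")
finish_audit(run, rows_read=120)
```

Because the tracking functions take the pipeline and source as parameters, the same audit code serves every ingestion flow, which is what makes the framework "generic".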

Selling point:
This engagement will provide an opportunity to develop new data solutions on the Azure stack.
The role involves data ingestion from several external source systems and third-party applications, and developing solutions in a very mature data environment with data catalog, data quality, and overall data governance integrated into the design and development regime.
The candidate will report directly to the Data Management Lead for Europe and will have the freedom to drive their deliverables end-to-end while collaborating with other architects.
This is an opportunity to work with a global team of data engineers in a hybrid cloud environment involving GCP and Azure.

Tasks:
The detailed daily tasks are to be defined; however, at a high level, the following tasks should be considered:
Working closely with business users to gather data requirements and convert them into techno-functional specs.
End-to-end design of data solutions on the Azure stack, including identifying data sources, data modelling, designing and building data pipelines to ingest data into SQL Server using Azure Databricks, and developing AAS cubes for business reporting.
Developing pragmatic and strategic automations to drive better data governance and operational excellence.
Working with upstream and peer teams to set up the correct architecture for data ingestion.
Executing test cases to unit test the development, and providing support through SIT and UAT cycles.
Participating in daily scrum calls and demonstrating daily progress on planned activities.
Providing technical help and guidance across other modules, if required.
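The ingestion tasks above (calling APIs to extract data from SharePoint and other applications) typically follow a paginated-extraction pattern. The sketch below illustrates only that pattern; `fetch_page` is a hypothetical stand-in for a real HTTP call to a source system, and the page size and record shape are assumptions, not any product's API.

```python
# Sketch of paginated API extraction for data ingestion.
# fetch_page is a stub: a real implementation would issue an HTTP GET
# against the source system (e.g. a SharePoint-style REST endpoint).

def fetch_page(offset, limit):
    """Stub for one page of an API response; pretends the source holds 25 records."""
    data = list(range(25))
    return data[offset:offset + limit]

def extract_all(page_size=10):
    """Pull every record by paging until a short or empty page is returned."""
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:   # last page reached
            break
        offset += page_size
    return records

rows = extract_all()
```

Stopping on a short page rather than a fixed request count keeps the loop correct regardless of how many records the source holds.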