Beechwood Centre, 40 Lower Gravel Road, Bromley, BR2 8GP

Data Architect

Role: Lead Data Architect
Location: Paris, France
Type of Hiring: Contract for 6 months
Rate: €900 a day
Hybrid model: 2 days working from home, 3 days onsite

Must be Fluent in French & English

Two years ago the client launched a program to modernize their data platform. The purpose of this platform is to enable new data uses at group level and to implement self-service BI. The transformation is built on the Snowflake platform (in their case hosted on AWS) and Power BI, and certain data domains, such as Retail, have already been migrated.
The client is now considering the next steps for this platform; the first step, implementing the analytics architecture, is complete. They are therefore looking for a lead data architect to help them define what comes next.

Technical skills needed:
– Data architectures (mastery of architectures at the scale of a large group)
– AWS cloud
– Snowflake
– Power BI and its ecosystem
– APIs

Data Architect

Role: Data Architect
Location: Reading, UK
Type of Hiring: Permanent
Salary: GBP 95K to GBP 110K per annum
Hybrid Model

Must be SC Cleared

Job Description:
· Strong experience in modern data platform architecture (Data Warehouses, Data Lakes, Dimensional Modelling, Streaming vs Batching, etc)
· Knowledge and understanding of key data strategy pillars across people, processes and technology
· Knowledge and understanding of Data Management principles, including Data Governance, Data Quality, Master and Reference Data Management, Metadata Management, Security and Compliance
· Strong experience in Data Warehouse methodologies
· Experience working across Azure Data Services including Azure Databricks, Azure SQL Database, Azure Data Lake Storage, Azure Cosmos DB, Azure Data Factory, Azure Blob storage and Power BI
· Strong experience in SQL
· Good understanding of Azure infrastructure (Subscriptions/Resource Groups, Virtual Networks, Kubernetes, Security options, etc)
· Good understanding of modern reporting platforms (dashboarding, advanced analytics, AI)
· An ability to explain complex problems in a simple and effective manner
· Strong peer and senior stakeholder management skills (including engagement with exec level sponsors)

Knowledge and experience of the following would be advantageous:
· Knowledge of Enterprise Architecture Frameworks
· Good knowledge of Azure DevOps Pipelines
· Strong experience in Apache Spark framework
· Previous experience in designing and delivering data warehouse and business intelligence solutions using on-premises Microsoft stack (SSIS, SSRS, SSAS)
· Knowledge of any other enterprise product/tool for CI/CD, e.g. Terraform, Jenkins

Data Architect

We have two new requirements for Data Architect roles based in Singapore.

Required Details:
· Client: HSBC & SCB (Standard Chartered Bank)
· # of Roles: 1 per client
· Hiring Location: Singapore (first preference; HK possible as a second location)
· Hiring Term: FTE
· Level of Exp: 10+ Years
· Offered Salary: SGD 200K Per Annum (Max)
· Visa Term: PR/Citizen/Singaporean (For Singapore) / HK – Open

Job Description
Our client, a global leader in financial services, is looking for a Data Architect to join their team and provide technical leadership for architecture on data-related projects.
Responsibilities Include:
· Implementing future state data architecture, analysing challenges and recommending data architecture solutions
· Providing, when required, architecture support for each release
· Defining and presenting solution designs, according to governance and standards
· Liaising with project teams and key stakeholders to meet data management and integrity control objectives
· Identifying ways to simplify, improve, automate and conform to data controls, and producing data models and roadmaps
· Identifying operational and delivery risks
· Providing data architecture governance, standards and target operating models
· Putting in place product and feature checks delivered by pods, ensuring adherence to standards and principles
· Representing data architecture at key forums, groups and boards
· Building and maintaining data models, enabling seamless implementation of data models into production
· Ensuring architecture results are compliant with requirements, approved technologies, best practices and strategies.

Key skills and experience:
· A proven track record, with over four years in Data Architecture gained within financial services, banking or large enterprises
· A background with Data Warehouses and RDBMS, ideally with Oracle, Teradata, DB2, MSSQL or MySQL, Data Modelling, and the ability to connect blocks and components
· A technical background and understanding of major cloud platforms
· Using design tools, such as EA, ArchiMate, IDA, Erwin and Visual Paradigm, among others
· An understanding of SDLC, Agile environments and enterprise architecture methodologies, for example TOGAF
· Knowledge of MQ, ETL, API Management and Web Services
· Managing stakeholders and working with diverse, cross-functional teams located locally and globally
· An ability to quickly develop and nurture strong working relationships
· Strong decision-making, problem-solving, analytical and interpersonal skills
· Excellent written and verbal communication skills, with ability to communicate complex ideas to non-technical people

Data Architect

Our client is looking for an experienced Data Architect to join a world-leading organisation. This role can be carried out remotely from anywhere in the UK, with occasional travel to London.

Major Responsibilities:
Drive use-case and requirements-gathering workshops with client and Brillio stakeholders for the data engineering and business intelligence workstreams
Synthesize data analyses into clear, sound recommendations; take responsibility for structuring and writing reports and client-ready presentations at the workstream or project level
Generate key hypotheses and independently structure work at the workstream or project level
Work collaboratively with client stakeholders and Brillio architects on:
Data modelling and data architecture design
ETL, data integration and data migration design
Master data management system and process design
Data quality system and process design
Develop effective working relationships with global teams and business partners
Review data engineering and analytics end-products to ensure accuracy, quality and timeliness
Keep an ear to the ground: collect and synthesize feedback from clients, project delivery analysts and sales teams for new solutions or product enhancements
Proactively seek new knowledge and structure project work to facilitate the capture of intellectual capital with minimal oversight

Qualification and Skills Required:
15+ years of experience in hands-on business intelligence and analytics tools, data modelling, data staging, and data extraction processes, and experience in advising clients on their data strategy, roadmap and use cases
Comprehensive understanding of data management best practices including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching.
Advanced degree in Economics/Statistics/Mathematics or B.A. in a technical or quantitative discipline with an emphasis on business applications. M.S./M.B.A. preferred
Experience of working in the Quick Service Restaurants and/or Retail/CPG space
Business knowledge of POS systems such as NCR Aloha / Compris, Transight is a plus
Strong MS-Excel and PowerPoint skills and excellent client facing communication
Understanding of data privacy requirements (GDPR)
Must have experience in data analysis and some experience with Big Data tools.
Cloud: Microsoft Azure, AWS
BI: MicroStrategy, Power BI
Experience in working with Atlassian JIRA and Confluence.
Cloud, Business intelligence and analytics certification an advantage.

Data Architect

Start date: 16th Aug or ASAP

JD Below:
Overall development experience (using Java and Big Data): 15+ years
Big data experience: 5+ years

Must Have:
Must have worked on big data technologies such as Hadoop, Hive, HBase and Tableau
Must have a deep understanding of which big data technology to use in which scenario
Must have experience building reporting on Big Data
Must bring solid experience using Java with the big data stack (Python and Scala are not required)
Must be able to demonstrate solutioning skills for enterprise-grade solutions
Must have worked with different storage techniques for Big Data
Must have experience with visualization technologies such as QlikView, Qlik Sense, Google Data Studio, etc.
Must have delivered a reporting solution in production
Should have an understanding of cloud technologies: AWS, Azure or GCP
Should have worked in the microservices domain

Nice to have:
Should have experience working with the trade life cycle
Should understand the finance domain
Should have experience with regulatory reporting