Beechwood Centre, 40 Lower Gravel Road, Bromley, BR2 8GP

AWS – EUC Solution Specialist

Our client is looking for an experienced AWS Specialist to join an exciting project in Romania. This position can be carried out remotely.

Summary of the Role & Responsibilities:

· The End User Computing (EUC) Solution Specialist is responsible for implementing EUC projects in line with the current strategy, roadmap, reference architectures, and standards
· Working with key stakeholders across Infrastructure Technology to develop and deliver the EUC portfolio work
· Developing business cases and technical studies, with a primary focus on project delivery; the individual will operate with the support and guidance of the Domain Architect and form part of the broader Infrastructure Architecture Team

Requirements | Skills, Knowledge and Abilities:

Must have:
· At least one to two implementation projects with AWS EUC services – Amazon WorkSpaces and AppStream 2.0
· Strong experience with scripting (e.g., PowerShell, Python)
· Hands-on experience with Amazon Web Services
· Expertise in Microsoft-based solutions including AD, GPO, Azure AD, and Windows Server
· Expertise in Windows 10 design, SCCM, and DSC
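The scripting and hands-on AWS requirements above can be sketched with a small EUC automation snippet. This is a minimal, illustrative example only: the sample payload mirrors the shape of the Amazon WorkSpaces DescribeWorkspaces API response, and in a real script it would come from `boto3.client("workspaces").describe_workspaces()` rather than a hard-coded dict.

```python
# Minimal sketch: summarizing WorkSpaces by state, as an EUC automation
# script might do. SAMPLE_RESPONSE mimics the shape of the WorkSpaces
# DescribeWorkspaces API response; in practice it would come from
# boto3.client("workspaces").describe_workspaces().
from collections import Counter

SAMPLE_RESPONSE = {
    "Workspaces": [
        {"WorkspaceId": "ws-0001", "UserName": "alice", "State": "AVAILABLE"},
        {"WorkspaceId": "ws-0002", "UserName": "bob",   "State": "STOPPED"},
        {"WorkspaceId": "ws-0003", "UserName": "carol", "State": "AVAILABLE"},
    ]
}

def count_by_state(response):
    """Return a mapping of WorkSpace state -> count."""
    return Counter(ws["State"] for ws in response["Workspaces"])

print(dict(count_by_state(SAMPLE_RESPONSE)))
```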

Nice to have:
· Experience deploying a desired-state configuration model and CI/CD for image management, and building the same on the AWS platform
· Experience with Microsoft modern management platforms such as Microsoft Endpoint Manager (Intune), Windows Autopilot, and VMware AirWatch
· Experience with AWS CloudWatch monitoring and Liquidware Stratusphere
· Experience with profile management tools such as Liquidware FlexApp and ProfileUnity

Data Modeler

Our client is looking for an experienced Data Modeler to join an exciting project in Romania. This role is 100% remote for Romanian citizens.
This role is responsible for designing and documenting data models (conceptual, logical, and physical) spanning multiple subject areas across multiple countries, acting as an expert in data modelling who can clearly articulate data modelling best practice to the business.

Summary of the Role & Responsibilities:

Main responsibilities:

· Analysing and translating business needs into long-term solution data models
· Evaluating existing data systems
· Working with the development team to create conceptual data models and data flows
· Developing best practices for data coding to ensure consistency within the system
· Evaluating existing data models and physical databases for variances and discrepancies
· Working with Business Analysts, Data Architects, software developers, and DBAs to achieve project objectives – delivery dates, cost objectives, quality objectives, business customer satisfaction objectives, etc.
· Creating entity-relationship diagrams for relational systems and dimensional diagrams for the existing and proposed Data Warehouse, Data Marts, and data cubes
· Defining ownership, obtaining approvals, and communicating data models to Technology teams
· Ensuring a consistent enterprise view and usage of data assets across global functions and standards
· Providing architectural expertise in Data Warehouse construction and engineering, data source identification, engineering, and migration on the latest database technologies (e.g., Snowflake)
· Providing feedback to the Business Intelligence project and management teams to improve data acquisition (structured and unstructured), warehousing methodologies, and tools

Requirements | Skills, Knowledge and Abilities:

Must have:
· Data modeling concepts (advanced level)
· Erwin, Snowflake, DB/MS SQL Server
· Modeling experience in large-volume (terabyte/petabyte-scale) environments
· Overall understanding of the database space (databases, data warehouses, reporting, analytics)
· Well versed in data modeling platforms such as Erwin and ER/Studio
· Strong skills in entity-relationship modeling (Erwin modeling software preferred)
· Good knowledge of database design/administration and advanced data modeling for star schema design
· Good understanding of normalized/denormalized/star/snowflake design concepts
· SQL query development for investigation and analysis
· Demonstrable experience using data modeling tools – e.g., Erwin, SQLDBM
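The star-schema and SQL bullets above can be illustrated with a small, self-contained sketch using Python's stdlib sqlite3 module: one fact table with foreign keys into two dimension tables, plus the kind of aggregate join query a dimensional model is designed to serve. The table and column names are illustrative assumptions, not from any real model.

```python
# Minimal star-schema sketch: dimensions, a fact table, and a typical
# dimensional aggregate query. Schema names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "Widget"), (2, "Gadget")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?)",
                 [(10, 2023), (11, 2024)])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0)])

# Typical star-schema query: join the fact to its dimensions and aggregate.
rows = conn.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)
```

A snowflake design would further normalize the dimensions (e.g., splitting a product category out of `dim_product` into its own table) at the cost of extra joins.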

Nice to have:
· Openness to learning new technologies in line with organizational needs and direction
· Experience with Data Profiling Tools and BI tools like MicroStrategy
· Experience with the technologies in the Hadoop ecosystem like Hadoop, HDFS, Spark, and Hive
· Experience in working with modern cloud data platforms in AWS (preferred)
· Experience in Agile development processes
· Cross-industry experience, e.g., quick-service restaurant or retail

Other details:
· Ability to independently learn new technologies and data related approaches
· Effective writing and presentation skills
· Excellent problem-solving skills, should possess a consulting mindset
· Experience managing metadata for data models
· Demonstrable experience in developing, publishing, and maintaining all documentation for data models
· Significant proven experience applying rigorous data management principles and industry-standard development approaches to model complex, high-volume data assets and to build and maintain conceptual, logical, and physical data models

Azure DevOps Engineer

Our client is looking for an experienced Azure DevOps Engineer to join a long-term contract. This role can be carried out remotely from anywhere in the EU.


· Design and implement Continuous Integration/Continuous Deployment (CI/CD) tooling for the client using Azure DevOps / GitHub Actions and related technologies.
· Define the Branching Strategy
· This includes defining and implementing build and test pipelines to ensure data quality, infrastructure as code (IaC) for the stateful deployment of environments, Role-Based Access Control (RBAC), linting and other code quality controls, as well as building out Azure DevOps releases.
· Assist in productionizing and maintaining complex DevOps build and release pipelines to deploy assets
· Release Communication and Collaboration
· Experience in TDD (Test Driven Development, especially with respect to CI/CD and DevOps)
· Good knowledge of security and compliance
· An expert in systems design with considerable skill and expertise in data lake implementation on Azure.
· Experience with Infrastructure as Code (ARM, Terraform, or similar)
· Experience with Test Automation
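The data-quality and TDD points above can be sketched as a small, testable validation step of the kind a CI build stage would run before deploying data assets. The record shape and rules here are illustrative assumptions, not the client's actual schema.

```python
# Illustrative data-quality gate of the kind a CI/CD build stage might
# run before a release. The schema rules below are assumptions.
def validate_records(records):
    """Return a list of error strings; an empty list means the batch passes."""
    errors = []
    for i, rec in enumerate(records):
        if not rec.get("id"):
            errors.append(f"row {i}: missing id")
        if rec.get("amount", 0) < 0:
            errors.append(f"row {i}: negative amount")
    return errors

good = [{"id": "a1", "amount": 10}, {"id": "a2", "amount": 0}]
bad  = [{"id": "",   "amount": -5}]

assert validate_records(good) == []      # gate passes, pipeline continues
assert len(validate_records(bad)) == 2   # gate fails, release is blocked
```

In a TDD workflow these assertions would live in the test suite first, with the pipeline step failing the build whenever the validator reports errors.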

Technical Skills:

· Strong Azure Cloud Knowledge
· Experience deploying Azure and/or Spark data components (Data Factory, Airflow, Data Lake (ACLs), Synapse)
· Logging frameworks (Azure Monitor, Splunk, Elastic, or similar)
· Monitoring tools (Azure Monitor Application Insights, Dynatrace, New Relic, Nagios, Zabbix)
· Azure DevOps (pipelines and releases) or GitHub Actions (YAML build pipelines)
· Prior experience with Jenkins or similar large-scale CI/CD technology is also acceptable.
· Complete command of source control (git), including branching strategies and policies
· Excellent Bash and Microsoft PowerShell skills
· Experience with database deployment pipelines (e.g., DACPACs or similar technology)
· Experience with networking and security elements (VNets, Peering, Firewalls, NAT, etc.)
· (Optional) Experience with containerized architectures, especially Kubernetes
· (Optional) Experience in SSO (single sign-on), and federated security
· (Optional) Experience with MLFlow and other MLOps pipeline technology