Data Architect/Modeller – Senior
Ministry – Government
6-month contract
No Clearance Required
5 days onsite
Must Have:
– Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
– Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
– Experience in the design, development and implementation of data models for analytics and business intelligence.
– Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake and data lakehouse architectures for the enterprise.
– Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
– Experience with middleware and gateways.
– Experience in designing/developing an automated data distribution mechanism.
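For illustration only, an automated data quality check of the kind the requirements above describe might be sketched as follows. The staging table, column names and check thresholds are hypothetical, and SQLite stands in for Databricks/Spark SQL; the pattern (named SQL checks returning offending-row counts) carries over to either platform.

```python
import sqlite3

# Hypothetical staging table, used only to make the sketch runnable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO stg_orders VALUES (?, ?, ?)",
    [(1, 10, 99.5), (2, 11, 15.0), (2, 11, 15.0), (3, None, 42.0)],
)

def run_quality_checks(conn):
    """Return a dict of check name -> offending row count (0 means the check passed)."""
    checks = {
        # Completeness: a required foreign key must never be null.
        "null_customer_id": "SELECT COUNT(*) FROM stg_orders WHERE customer_id IS NULL",
        # Uniqueness: count surplus rows sharing a business key.
        "duplicate_order_id": """
            SELECT COALESCE(SUM(cnt - 1), 0)
            FROM (SELECT COUNT(*) AS cnt FROM stg_orders GROUP BY order_id)
            WHERE cnt > 1
        """,
        # Validity: amounts must be positive.
        "non_positive_amount": "SELECT COUNT(*) FROM stg_orders WHERE amount <= 0",
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_quality_checks(conn)
```

In a pipeline, a non-zero count would typically fail the run or quarantine the offending rows before they reach the curated model.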
Responsibilities:
- Develops and implements the data architecture for application development in a complex and distributed environment, including the determination of the flow and distribution of data, the location of databases, and data access methods
- Provides a standard common business vocabulary, expresses strategic data requirements, outlines high level integrated designs to meet these requirements, and aligns with the enterprise strategy and related business architecture
- Defines conceptual, logical and physical models, including the mapping from data source to curated model and data mart
- Designs dimensional data mart models, creates source-to-target mapping documentation, and designs and documents data transformations from curated model to data mart
- The Architect/Modeller must have previous work experience conducting knowledge transfer and training sessions, ensuring that staff receive the knowledge required to support the system. The resource must develop learning activities using the review-watch-do methodology and demonstrate the ability to prepare and present materials.
- Develops documentation and materials as part of the review and knowledge transfer to other team members
- Monitors identified milestones and submits status reports to ensure knowledge transfer is fully completed
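As an illustrative sketch of the dimensional design and source-to-target mapping responsibilities above (all table and column names are hypothetical, with SQLite standing in for the target platform), a curated table can be transformed into a one-dimension star schema while the mapping itself is kept as data, so it can double as mapping documentation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical curated source table.
conn.execute("CREATE TABLE curated_sales (sale_date TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO curated_sales VALUES (?, ?, ?)",
    [("2024-01-01", "Widget", 10.0), ("2024-01-01", "Gadget", 20.0), ("2024-01-02", "Widget", 5.0)],
)

# Target data mart: one dimension plus one fact table (a minimal star schema).
conn.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT UNIQUE)")
conn.execute("CREATE TABLE fact_sales (sale_date TEXT, product_key INTEGER, amount REAL)")

# Source-to-target mapping held as data, exportable as mapping documentation.
mapping = [
    ("curated_sales.product", "dim_product.product_name", "distinct load"),
    ("curated_sales.sale_date", "fact_sales.sale_date", "direct"),
    ("curated_sales.amount", "fact_sales.amount", "direct"),
]

# Transformation from curated model to data mart: load the dimension,
# then resolve the surrogate key while loading the fact table.
conn.execute("INSERT INTO dim_product (product_name) SELECT DISTINCT product FROM curated_sales")
conn.execute("""
    INSERT INTO fact_sales (sale_date, product_key, amount)
    SELECT s.sale_date, d.product_key, s.amount
    FROM curated_sales s JOIN dim_product d ON d.product_name = s.product
""")

fact_rows = conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0]
dim_rows = conn.execute("SELECT COUNT(*) FROM dim_product").fetchone()[0]
```

Keeping the mapping as a structure rather than prose is one way to generate the source-to-target documentation and the transformation from a single source of truth.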
General Skills:
Technical Experience (30%)
– Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
– Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
– Knowledge of performance considerations for different database designs in different environments.
– Knowledge and experience in information resource management tools and techniques.
Data Architecture & Modeling (50%)
– Experience in the design, development and implementation of data models for analytics and business intelligence.
– Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake and data lakehouse architectures for the enterprise.
– Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
– Experience in structured methodologies for the design, development and implementation of applications.
– Experience in systems analysis and design in large or medium systems environments.
– Experience in the use of data modelling methods and tools (e.g. ERwin, Visio, PowerDesigner), including a working knowledge of metadata structures, repository functions, and data dictionaries.
– Experience in monitoring and enforcing data modelling/normalization standards.
– Experience in developing enterprise architecture deliverables (e.g. models).
Agile Product Development (20%)
– Experience working in an agile, sprint-based development environment
– Understanding and working knowledge of iterative product development cycles (Discovery, Agile, Beta, Live)
– Experience collaborating and sharing tasks with multiple developers on complex data product deliveries
– Experience contributing to version-controlled, shared codebases using git (Azure DevOps, GitHub, Bitbucket) and participating in pull request code reviews.
Desirable Skills:
- Experience with middleware and gateways
- Experience in designing/developing an automated data distribution mechanism
- Knowledge and understanding of object-oriented analysis and design techniques.
- Experience in developing enterprise architecture deliverables (e.g. models) based on Ontario Government Enterprise Architecture processes and practices
- Knowledge and understanding of Information Management principles, concepts, policies and practices
- Experience creating detailed data standards to enable integration with other systems
- Experience reviewing conceptual, logical and physical data models for quality and adherence to standards
- Knowledge and understanding of dimensional and relational data models
- Knowledge and experience in information resource management tools and techniques
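The "automated data distribution mechanism" listed among the desirable skills could take many forms; as one hedged sketch, a subscription-driven extract generator produces a filtered CSV per consumer (the mart table, subscriber names and predicates are hypothetical, and a real mechanism would deliver through a gateway, queue or file share rather than in memory):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mart_sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO mart_sales VALUES (?, ?)",
    [("East", 10.0), ("West", 20.0), ("East", 5.0)],
)

# Hypothetical subscription list: consumer -> row filter.
# Predicates here are trusted configuration, not end-user input.
subscriptions = {"east_team": "region = 'East'", "west_team": "region = 'West'"}

def distribute(conn, subscriptions):
    """Build one CSV extract per subscriber and return them keyed by consumer."""
    extracts = {}
    for consumer, predicate in subscriptions.items():
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["region", "amount"])
        for row in conn.execute(f"SELECT region, amount FROM mart_sales WHERE {predicate}"):
            writer.writerow(row)
        extracts[consumer] = buf.getvalue()
    return extracts

extracts = distribute(conn, subscriptions)
```

Driving distribution from a subscription table rather than hand-built exports is what makes the mechanism automated: adding a consumer is a configuration change, not new code.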