Job Title: Data Modeler
Duration: 4 months (possibility of extension)
Location: Toronto
Story Behind the Need
• Business group: As part of the Enterprise Data Architecture team, the Data Modeler is required to translate requirements into desired end-state logical data models that are sustainable, adaptable, and aligned to business and enterprise needs. The role must also conform to and implement Enterprise Architecture best practices for conceptual, logical, and physical data modeling, in support of the new BNS enterprise data strategy, which is shifting from various source systems to Google Cloud Platform.
• Reason for request: Project
Candidate Value Proposition
The successful candidate will have the opportunity to work downtown in a friendly environment with a great work culture, while gaining hands-on experience with cutting-edge technology.
Typical Day in Role
• Work with the team to design and build data models for ingestion from various source systems to Google Cloud Platform.
• Model the logical structure and convert it into a physical structure in Google Cloud-supported databases (see the sketch after this list).
• Understand new data sources, provide modeling recommendations, and integrate them with existing models.
• Assist the business liaison and ETL function with data-related issues such as assessing data quality, data consolidation, data lineage, etc.
• Work closely with other developers on the team, business analysts, QA analysts, and project managers in developing project estimates.
• Design dimensional/3NF data models and Big Data-compliant data structures.
• Ensure conformance to Enterprise Architecture data modeling best practices.
• Work with various cross-functional groups (i.e., data domain owners, business/data analysts, architects, and developers) to develop logical models based on business and enterprise requirements.
• Responsible for defining the data structures (logical and physical) to support data integrity, performance, and recoverability across batch, real-time, and near-real-time frameworks.
• Define and implement data architecture for corporate data governance and database design principles.
• Identify the key facts and dimensions necessary to support the business requirements, performing the activities necessary to support the standardization of entities and attributes.
• Develop entity and attribute descriptions and definitions for the models and ensure that conflicts in data models are resolved.
• Perform source data quality assessments:
o Perform data profiling and data sampling, and analyze the results to include business constraints in the data model
o Capture business and technical metadata, create and manage data lineage.
• Interface with Data Governance teams to leverage central data repositories and participate in Data Governance-related activities
• Assist the Data Architect in ensuring the data architecture is applied properly
• Gather requirements and apply strong analysis and design skills to build system specifications
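To make the logical-to-physical conversion and dimensional-design duties above concrete, here is a minimal sketch in BigQuery Standard SQL (one of the Google Cloud-supported databases this posting targets). All dataset, table, and column names are hypothetical, and the partitioning/clustering choices are illustrative rather than prescriptive:

    -- Hypothetical star schema: one dimension plus a fact table.
    -- BigQuery does not enforce primary/foreign keys, so referential
    -- integrity lives in the logical model and the load process.
    CREATE TABLE IF NOT EXISTS sales_mart.dim_customer (
      customer_key INT64 NOT NULL,   -- surrogate key from the logical model
      customer_id  STRING,           -- natural/business key
      full_name    STRING,
      segment      STRING
    );

    CREATE TABLE IF NOT EXISTS sales_mart.fact_sales (
      sale_date    DATE NOT NULL,
      customer_key INT64 NOT NULL,   -- FK to dim_customer (logical only)
      product_key  INT64 NOT NULL,
      quantity     INT64,
      net_amount   NUMERIC
    )
    PARTITION BY sale_date                 -- date partitioning for batch loads
    CLUSTER BY customer_key, product_key;  -- common join/filter columns

Physical choices such as partitioning and clustering stand in for conventional indexing in BigQuery; documenting trade-offs like these is part of converting a logical model into a physical one.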
Candidate Requirements/Must Have Skills:
1) 5+ years of expertise in leading, designing, developing, testing, maintaining, implementing, and documenting data architecture and data modeling (normalized, dimensional, logical, and physical) solutions for Enterprise Data Warehouses and Enterprise Data Marts
2) Strong experience in cloud data modeling is mandatory (AWS, Azure, or Google Cloud; no private clouds or Cloudera) – 3+ years of hands-on experience
3) Cloud database experience (BigQuery, Redshift, or Cosmos DB) – 3+ years of hands-on experience
4) Experience in data lake/data warehouse implementation and transformation – 5+ years of experience
5) Master Data Management & Data Governance Experience – 3+ years
6) Experience working in an Agile Environment and ability to work under pressure – 3+ years of experience
7) Experience with Structured & Unstructured Data – 5+ years of experience
8) In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon, and/or Google (Big Data and conventional RDBMS), such as Microsoft Azure SQL Data Warehouse, Teradata, Redshift, BigQuery, Snowflake, etc.
9) Fluency in at least 2 programming languages, preferably Java and Python – 3+ years of hands-on experience
10) Proficiency in data modeling tools such as ER/Studio, ERwin, or related tools – 5+ years of experience
11) 3+ years of experience in complex, large-scale data warehouse and data integration projects
12) Hands-on coding experience in SQL/PL-SQL – stored procedures, materialized views, complex queries, and performance tuning – 5+ years (see the sketch after this list)
13) 5+ years of hands-on experience in data modeling
14) Financial Services Industry Experience (North American).
15) Open to converting to FTE
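Requirement 12 can be illustrated with a minimal sketch, again assuming BigQuery Standard SQL and the hypothetical sales_mart objects from the earlier example (a PL/SQL equivalent would differ in syntax but not in intent; stg_sales is a hypothetical staging table):

    -- Hypothetical materialized view pre-aggregating the fact table;
    -- BigQuery maintains it incrementally, a common performance-tuning
    -- technique for repeated aggregate queries.
    CREATE MATERIALIZED VIEW IF NOT EXISTS sales_mart.mv_daily_sales AS
    SELECT
      sale_date,
      customer_key,
      SUM(net_amount) AS total_net_amount,
      SUM(quantity)   AS total_quantity
    FROM sales_mart.fact_sales
    GROUP BY sale_date, customer_key;

    -- Hypothetical stored procedure wrapping an idempotent daily load.
    CREATE OR REPLACE PROCEDURE sales_mart.sp_daily_load(run_date DATE)
    BEGIN
      MERGE sales_mart.fact_sales AS t
      USING (SELECT * FROM sales_mart.stg_sales WHERE sale_date = run_date) AS s
      ON  t.sale_date = s.sale_date
      AND t.customer_key = s.customer_key
      AND t.product_key = s.product_key
      WHEN MATCHED THEN
        UPDATE SET quantity = s.quantity, net_amount = s.net_amount
      WHEN NOT MATCHED THEN
        INSERT (sale_date, customer_key, product_key, quantity, net_amount)
        VALUES (s.sale_date, s.customer_key, s.product_key, s.quantity, s.net_amount);
    END;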
Nice-To-Have Skills:
– Familiarity with Scala
– Knowledge of Capital Markets and Risk Domain
– Knowledge of IBM Financial Services Data Model (FSDM)
– Good knowledge of Lambda architecture and patterns.
– Hands-on experience deploying in GCP using a combination of BigQuery, Cloud SQL, Cloud Storage, and Cloud Dataflow
– Experience with Cloud Datalab, Cloud Dataproc, and Cloud Pub/Sub.
– A certification such as Google Cloud Professional Cloud Architect, Google Professional Data Engineer, or a related AWS Certified Solutions Architect / Big Data or Microsoft Azure Architect certification
– Understanding of indexing, partitioning, and data design performance considerations for industry-standard DBMSs
– Familiarity with Big Data, in-memory databases, OLAP & other emerging technologies
– Knowledge/experience with performance testing/tuning tools for enterprise applications
Degrees:
• Bachelor's degree in a technical field such as computer science, computer engineering, or a related field is required
Candidate Review & Selection
– 2-step process: 1st, a phone interview (communication skills, knowledge); 2nd, an in-person technical panel interview