Job Title: Data Analyst / Data Architect
Location Address: WFH (Scarborough, ON)
Contract Duration: 6 months
Our large financial client needs one strong Data Analyst / Data Architect to support two large regulatory initiatives for upcoming projects: TSYS and PCMLTFA.
Candidate Value Proposition
• The successful candidate will have the opportunity to work on multiple initiatives, gaining exposure to multiple bank streams and new technology. The individual is comfortable working with both business and technical staff to ensure systems are designed and maintained according to enterprise architectural standards. Collaborating with team members, they will apply agile best practices and metrics to build high-quality technology solutions in line with the product's vision.
The main function of the Data Analyst is to provide business intelligence support to business areas through both repeatable and ad hoc reports (charts, graphs, tables, etc.) that enable informed business decisions.
Typical Day in Role
• Perform unit testing of solutions to optimize performance, create efficiencies, and address the root cause of incidents
• Create and maintain detailed design documents, and support the lifecycle of those documents.
• Create and input metadata for ingestion into the EDL (Enterprise Data Lake)
• Build and test integration software solutions.
• Help maintain code quality, organization, and performance.
• Participate in technical meetings with client's technical specialists.
• Provide support for testing efforts and defect resolution.
• Provide deployment and post-deployment support (e.g., warranty support, command center services, process support, release content review, and client coordination).
• Focus on logical support work such as configuration, data management, application performance tuning, and application troubleshooting
Must have:
1. 3+ years' minimum experience with heavy data modelling for ingestion: logical and physical data modelling across multiple sources
2. Must have experience with Erwin modelling (minimum 3 recent projects)
3. 6+ years ingesting data into the EDL (Enterprise Data Lake) using HDFS/Hive
4. 3–4 years' minimum experience supporting the ingestion process by creating the metadata used during ingestion (working with the team to supply metadata for ingestion); strong experience with Hive and HDFS required
5. Excellent communication skills to support business consumers in data analysis and data consumption by developing queries for business stakeholders
Nice to have:
– RDBMS, XML, and COBOL copybook experience is a plus
– Financial industry experience is a plus