About the Role
The Hadoop Developer will be a key member of the client’s Business Intelligence Big Data team, working on the Hadoop platform in close collaboration with Hadoop Administrators, Data Scientists, and business stakeholders. Responsibilities include, but are not limited to:
• Develop high-performance data processing pipelines
• Partner with Business Analysts and internal customers to improve data coverage and analytic capabilities
• Optimize ETL code to produce production-ready code
• Document all development code
• Test new tools in the Hadoop ecosystem
• Cross-train in advanced analytical techniques
• Actively research and share learning and advancements in the Hadoop space with the rest of the team, especially related to development
• Take initiative to research, learn, and recommend emerging technologies
Qualifications
• Experience working with Apache Spark, Kafka, and other big data technologies
• Experience coding in a variety of languages such as Python, Scala, and Java
• Experience developing Big Data ingestion frameworks or working with ingestion tools
• Demonstrated analytical and problem-solving skills, particularly as they apply to a Big Data environment
• Experience with data pipeline ETL tools such as Talend (added advantage)
• Experience with shell scripting or Python (added advantage)
• Previous Java experience (added advantage)
• Strong communication skills
• Self-motivated
• Willingness to learn
• Excellent planning and organizational skills