Data Engineer
Typical Day in Role:
• Hands-on engineering of data processing applications
• Experience with automated deployment of code using CI/CD tools like Jenkins, Bitbucket, Docker, Artifactory
• Experience with Big Data technologies, assisting clients in building software solutions that are distributed, highly scalable, and span multiple data centers
• Hands-on experience architecting Big Data applications using Hadoop technologies such as Spark, HDFS, Hive, Sqoop, and HBase, along with Python
• Database setup/configuration/troubleshooting
• Automating workflow scheduling with Airflow (a minimal DAG sketch appears after this list)
• Strong experience with event stream processing technologies such as Spark Streaming, Storm, Akka, and Kafka (a Kafka consumer sketch also appears after this list)
• Experience working with Python & Spark
• Extensive experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR)
• Experience working with Business Intelligence teams, Data Integration developers, Data Scientists, Analysts, and DBAs to deliver a well-architected and scalable Big Data & Analytics ecosystem
• Proven track record of architecting distributed solutions that handle very high data volumes (petabytes)
• Strong troubleshooting and performance-tuning skills
• Experience with SQL and scripting languages (such as Python, R)
• Deep understanding of Google Cloud computing infrastructure and platforms
• Good understanding of big data design patterns
• Ability to analyze business-requirement user stories and model them as domain-based services
• Experience working under an agile delivery methodology
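For the Airflow item above, a minimal sketch of a scheduled DAG, assuming Airflow 2.4+; the DAG id, task names, and schedule are illustrative placeholders, not taken from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder for a real extraction step (e.g., pulling files from GCS or HDFS).
    print("extracting source data")

def load():
    # Placeholder for a real load step (e.g., writing to Hive or BigQuery).
    print("loading into the warehouse")

with DAG(
    dag_id="daily_ingest",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # run once per day
    catchup=False,                    # do not backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task         # extract must finish before load runs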
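For the stream-processing item, a minimal sketch of consuming a Kafka topic with Spark Structured Streaming, assuming pyspark with the spark-sql-kafka connector on the classpath; the broker address and topic name are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings before processing.
decoded = events.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

query = (
    decoded.writeStream
    .format("console")       # print each micro-batch; for demonstration only
    .outputMode("append")
    .start()
)
query.awaitTermination()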
Must haves/Requirements:
• A minimum of 4 years’ experience in hands-on development as a Data Engineer or in a related field
• 4+ years’ experience deploying data engineering solutions in a production setting
• 3+ years of hands-on experience with GCP is required (a BigQuery sketch appears after this list)
• Experience with Python & Spark (production-level coding)
• Capability to architect highly scalable distributed data pipelines using open-source tools and big data technologies such as Hadoop, HBase, and Spark (see the pipeline sketch after this list)
• Experience designing scalable solutions with proficiency in the use of data structures and algorithms
• Previous work experience with Data Modeling (Basic Level) & Data Architecture
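For the GCP requirement above, a minimal sketch of querying BigQuery from Python, assuming the google-cloud-bigquery client library and application-default credentials; the project, dataset, and table names are placeholders.

from google.cloud import bigquery

client = bigquery.Client()  # project and credentials come from the environment

query = """
    SELECT user_id, COUNT(*) AS event_count
    FROM `my_project.analytics.events`   -- hypothetical table
    GROUP BY user_id
    ORDER BY event_count DESC
    LIMIT 10
"""

# Submit the query job and iterate over the result rows.
for row in client.query(query).result():
    print(row.user_id, row.event_count)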
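For the pipeline requirement, a minimal sketch of a batch PySpark pipeline, assuming a Hadoop cluster with HDFS; the paths, columns, and aggregation are illustrative stand-ins for real business logic.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Read raw events from HDFS (hypothetical path; assumes an event_ts column).
raw = spark.read.parquet("hdfs:///data/raw/events")

# Aggregate daily event counts per user.
daily = (
    raw.withColumn("day", F.to_date("event_ts"))
       .groupBy("day", "user_id")
       .agg(F.count("*").alias("event_count"))
)

# Write back partitioned by day so downstream Hive queries can prune partitions.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "hdfs:///data/curated/daily_user_counts"
)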
Nice to have:
• Experience architecting, implementing, and customizing digital analytics/customer data platform solutions with the Adobe Experience Platform
• Expertise in data governance for big data space with knowledge of various MDM/entity resolution solutions
• Previous experience with Adobe’s Experience Platform (AEP) Real-time CDP solution
• Experience in cloud-based environments with PaaS & IaaS
• Previous experience with the AEP digital marketing platform and the Adobe tech stack
• Experience with Talend or a comparable ETL tool
Soft skills:
• Excellent written, presentation, and verbal communication skills
• Ability to work as part of a team, as well as independently or with minimal direction
• Ability to collaborate with data architects, modelers, and IT team members on project goals
Education/Experience:
• Bachelor’s degree in a technical field such as computer science, computer engineering, or a related discipline is required
• GCP certification is a plus