Key Responsibilities
Data Engineering
• Hands-on experience with at least one ETL tool, preferably NiFi.
• Custom development using Shell and Python scripting.
• Hadoop ecosystem tools such as Sqoop, Hive, etc.
• Experience with job scheduling tools.
• Good understanding of RDBMS concepts and SQL, with hands-on experience in Postgres/Oracle databases.
• Strong technical design, problem-solving, and debugging skills.
• Good communication skills and the ability to take ownership.
• Experience in ETL, data ingestion, or similar areas is preferred.
Requirements
To be successful in this role, you should meet the following requirements:
• Experience with a cloud platform such as GCP or AWS.
• Kafka, Spark, and PySpark.
• Angular or React framework for UI development.
• Java programming, with experience in the Spring/Spring Boot framework.
• GitHub, Jenkins, Ansible, etc., with an understanding of CI/CD concepts.