Snowflake Data Engineer at Wipro, Bengaluru, Karnataka, India

Full Time


This career listing is for Wipro in Bengaluru, Karnataka, India (2022).

About the job

Job Description

  • Experience architecting, designing, and developing Big Data processing pipelines
  • Interest and passion for Big Data technologies, and an appreciation of the value that can be derived from data
  • Snowflake database and AWS Glue experience
  • Functional knowledge of Trading in order to build a correct Data Domain Model that can be leveraged to build different Analytical Models
  • Ingest the correct domain data from the source applications (Trading & Supply) into the Data Lake to build the Trade Data Foundation
  • Help build the Master Data Management architecture, understanding the base entities that require harmonization
  • Build a common trade & supply chain data schema that can be leveraged for ingestion from all Trading Applications into the Data Lake
  • Proficiency in Spark/Impala/Hive/Kudu/StreamSets development and experience with Hadoop and Spark data processing technologies required
  • Hands-on experience with Hadoop and the Hadoop ecosystem required – proven experience within Cloudera Hadoop ecosystems (Spark, HDFS, YARN, Hive, HBase, Sqoop, Pig, Hue, etc.)
  • Exposure to Big Data technologies (Hadoop, Spark, Scala, Python, Impala, Hive, etc.)
  • Implement complex real-time data processing algorithms in an optimized and efficient manner using Scala/Java
  • Knowledge of at least one scripting language, such as Python, Unix shell scripting, or Perl, is essential for this position
  • Excellent analytical and problem-solving skills, with a willingness to take ownership and resolve technical challenges
  • Excellent communication and stakeholder management skills

Company: Wipro

Vacancy Type: Full Time 

Job Location: Bengaluru, Karnataka, India

Application Deadline: N/A

Apply Here