Job Description:

- Hands-on experience with PySpark (2-4 years); good ETL knowledge; develop scripts for data processing, sanity checks, and validation; construct and maintain data models

- Coordination with other teams (e.g. API, UI/UX, QA/Testing); data dictionary creation

- Experience with PySpark, MSSQL, and Git

- Must have worked on architecting big data solutions

- Design and implement an end-to-end data product

- Given a data problem, should be able to architect the best solution and build a prototype

- Sound knowledge of the big data technology stack; able to work with any tool for a given data problem

- Technical stack covering batch processing, analytics, configuration and deployment

- Must have worked on Agile projects

- Exposure to DevOps tools

- Must have experience in leading small teams

- Nice to have: ability to create tools in the data space (e.g. for data engineering, data ingestion, data analysis, modeling)

- Nice to have: ability to conceptualize, define, and prototype frameworks

Profile Summary:

Employment Type: Full Time
Salary: Not Mentioned
Deadline: 18th Mar 2020

Key Skills:

Company Profile:

Not Mentioned

All rights reserved © 2018 Wisdom IT Services India Pvt. Ltd