• Thoucentric - Junior Data Architect - ETL Tools / Data Warehousing Jobs in Bangalore, India

  • Thoucentric Technology Private Limited

Job Description:

Technical Competencies

- Should have experience in designing and architecting large-scale distributed applications

- Must have excellent data modelling skills (relational and dimensional) for Cloud or Big Data applications

- Excellent knowledge of relational database technologies, with experience on databases such as Oracle / SQL Server / Amazon Redshift, and operational knowledge of NoSQL databases

- Hands-on experience with any of the ETL tools (Informatica / DataStage / Ab Initio / Alteryx / Talend / SSIS); Microsoft SSIS / Azure is preferred

- Data warehousing skills are mandatory

- Must be familiar with at least one scripting language (Unix shell, Perl, Python, etc.)

- DB architecture, data modelling and database design are mandatory

- Must have experience in database and performance tuning

- Exposure to data migration from on-premises to cloud is desirable, although not mandatory

- Experience with any one cloud platform; Azure is preferred

- Exposure to the Hadoop stack (Hadoop, Hive, Pig, HBase, Sqoop, Flume, Spark, Shark, Oozie, etc.)

- Exposure to the Microsoft Azure platform (HDInsight, ADF (Azure Data Factory), Azure Cloud Services, Event Hubs, SQL DW, Databricks)

Functional Competencies

- Proven experience in developing architecture blueprints, design specifications, data interfaces and integration approaches, infrastructure requirements, etc.

- Must be well versed in architecting a data-as-a-service layer for servicing digital assets, decision support systems (BI), visualization/reporting and analytics

- Good understanding of architecting a data ingestion framework capable of processing structured, semi-structured and unstructured data sets in batch and real time, and of integrating them with internal data to develop real-time actionable insights

- Conduct prototyping with the solutions and drive requirements to closure with the business users; able to use open analytics solutions for quick prototyping, then implement and operationalize them

- Candidate should be able to architect highly scalable distributed systems / big data solutions using different open-source tools; the Microsoft platform is preferred

- Candidate should be able to design, develop, load, maintain and test large-scale distributed systems

- Should be able to focus on analysing and visualizing large data sets to turn information into insights using multiple platforms

- Translate complex functional and technical requirements into detailed design

- Should be able to install, configure and support Big Data tools

- Maintain security and data privacy

- Propose best practices/standards

- Be part of POC efforts to help build new big data clusters

- Analytical mind with a problem-solving aptitude

Personal Attributes

- Ability to cope in a complex and fast-changing business environment, and to respond calmly and rationally to changing aspirations in a deadline-driven situation

- Good communication skills with a capacity to present, discuss and explain issues coherently and logically both in writing and verbally

- Good team player, self-motivated and able to work on own initiative

Location: Candidates will be based in Bangalore and should be open to travel on work-related assignments

Skills: ETL Tools, Data Warehousing, Python, Hadoop stack

Job Type - Permanent

Profile Summary:

Employment Type : Full Time
Salary : Not Mentioned
Deadline : 20th Feb 2020

Company Profile: Not mentioned
