The requirements vary with the position and with each company's or project's roles and responsibilities. Below are a few sample job descriptions/combinations.
Based on these, you can plan which skills to develop.
• Good understanding of the AWS data stack, including services like S3, Redshift, EMR, Lambda, Athena, and Glue
• Good understanding of Hadoop/HDFS
• Designing and developing ETL jobs across multiple platforms
• Experience in developing DB schemas, creating ETLs, and familiar with MPP/Hadoop systems
• Good knowledge of Operating Systems (Unix or Linux)
• Good understanding of data warehouse methodologies
• Hands-on experience in a programming language such as Python
• Good hands-on experience with SQL
• Good written and oral communication and presentation skills
• Knowledge of the Big Data ecosystem (Hadoop MapReduce, Pig, and Hive) is a strong plus
• ~4-6 years of overall experience
• Good knowledge of Agile principles and experience working in Scrum teams using Jira
Good to have: work experience at a services company while deployed at larger product-based companies.
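To make the ETL and SQL bullets above concrete, here is a minimal sketch of the kind of job they describe: extract rows from a CSV source, transform them, and load them into a SQL table. The file contents, table name, and schema are illustrative assumptions, not taken from any real posting; SQLite stands in for a warehouse such as Redshift.

```python
import csv
import io
import sqlite3

# Extract: parse CSV from an in-memory source (a stand-in for an S3 or HDFS input).
raw = "id,amount\n1,10.5\n2,20.0\n3,-3.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only positive amounts and cast text fields to proper types.
clean = [(int(r["id"]), float(r["amount"]))
         for r in rows if float(r["amount"]) > 0]

# Load: write the cleaned rows into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)

# Query the loaded data, as an analyst or downstream job would.
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
```

In a real role the extract step would read from S3 (e.g. via boto3 or Glue), and the load step would target Redshift or Hive, but the extract-transform-load shape stays the same.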