RBAC and SSO access control; set up a clustered IDM environment; deployed commonly used IDM connectors (Web Service, Active Directory).
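The RBAC skill above can be illustrated with a minimal sketch; the role and permission names below are hypothetical and not taken from any specific IDM product:

```python
# Minimal RBAC sketch: roles map to sets of permission strings.
# Role/permission names are illustrative only.
ROLE_PERMISSIONS = {
    "admin": {"user:create", "user:delete", "report:view"},
    "analyst": {"report:view"},
}

def has_permission(roles, permission):
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(r, set()) for r in roles)

print(has_permission(["analyst"], "report:view"))   # True
print(has_permission(["analyst"], "user:delete"))   # False
```

Real IDM products layer sessions, role hierarchies, and policy engines on top of this basic role-to-permission mapping.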
1) Should be able to collaborate with cross-functional teams and understand customer needs to deliver the best-suited solution. 2) Should be able to investigate issues and conduct root cause analysis to solve a variety of problems in their area of work. 3) Should be able to communicate clearly, fluently, and assertively, and present ideas effectively.
MANDATORY: Unix shell scripting, Spark, Hive, Scala or Java. SECONDARY: NiFi, Kafka or Sqoop. Good to have: DevOps, Control-M.
- To have 3-6 years of experience.
- To have Hadoop skill sets such as HDFS, Hive, Spark, YARN, Ranger, and Ambari, with strong SQL expertise to develop queries using business requirements/functional specifications as input and to work on performance optimization as required.
- To have experience with, or a basic understanding of, the change management process for migrating developed code from development to test to production using version control applications.
- To have knowledge and understanding of scheduling applications, preferably Control-M.
- To have knowledge and understanding of DevOps.
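The SQL performance-optimization skill above often comes down to partition pruning in Hive/Spark. The sketch below builds a partition-filtered HiveQL query from requirement inputs; the table name `sales_partitioned` and partition column `ds` are hypothetical:

```python
# Hypothetical example: building a partition-pruned HiveQL query from
# business-requirement inputs. Table/column names are illustrative.
def build_daily_sales_query(run_date: str, region: str) -> str:
    # Filtering on the partition column (ds) lets Hive/Spark scan only
    # matching partitions instead of the full table.
    # (A real job should validate/parameterize inputs, not interpolate.)
    return (
        "SELECT region, SUM(amount) AS total_amount "
        "FROM sales_partitioned "
        f"WHERE ds = '{run_date}' AND region = '{region}' "
        "GROUP BY region"
    )

print(build_daily_sales_query("2024-01-15", "EMEA"))
```

The same query string could be submitted through `spark.sql(...)` or Beeline; the pruning benefit depends on the table actually being partitioned on the filtered column.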
Should have a minimum of 3 to 5 years of experience in cloud environments.
● Proficient in shell scripting and one programming language (Java/Python/Scala).
● Hands-on experience with multiple cloud environments (AWS/Azure).
● Experience with Azure HDInsight (HDI): Hadoop, Hive, Spark, Presto.
● Experience with real-time data ingestion using a Kafka cluster.
● Good knowledge of data migration from S3 to Azure Blob Storage.
● Good knowledge of VPN set-up is an added advantage.
● Good knowledge of security mechanisms such as Kerberos/SSL, key management, etc.
● Hands-on knowledge of Docker/Kubernetes.
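The S3-to-Azure-blob migration item above usually reduces to streaming objects in chunks and verifying a checksum at the end. A minimal sketch of that pattern, with local `BytesIO` objects standing in for the boto3 and azure-storage-blob clients (which are not shown):

```python
import hashlib
import io

# Sketch of the chunked-copy-with-checksum pattern behind cloud-to-cloud
# data migration; BytesIO objects stand in for S3/Azure SDK streams.
def copy_with_checksum(src, dst, chunk_size=4 * 1024 * 1024):
    """Stream src to dst in fixed-size chunks; return MD5 of bytes copied."""
    digest = hashlib.md5()
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        digest.update(chunk)
        dst.write(chunk)
    return digest.hexdigest()

payload = b"example payload" * 1000
src, dst = io.BytesIO(payload), io.BytesIO()
md5 = copy_with_checksum(src, dst)
assert dst.getvalue() == payload  # every byte arrived intact
print(md5)
```

In a real migration the same loop would read from an S3 `get_object` body and write via an Azure block-blob upload, comparing the digest against the source object's stored checksum.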
1) Azure/AWS cloud platform set-up and maintenance for clients. 2) Suggesting and implementing best practices for clients on the Azure cloud platform, e.g., setting up alerts and monitors, resolving packet loss, and suggesting other best practices. 3) Setting up DevOps automation for projects of different flavors. 4) Quickly spinning up different clusters on the cloud for internal projects. 5) Automation of platform activities, including big data cluster set-up, VM spin-up, etc., using Terraform/Ansible.
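The Terraform-driven automation in item 5 is typically a scripted init/plan/apply sequence. A hedged sketch of such a wrapper follows; the working directory and var-file names are hypothetical, and `dry_run=True` returns the commands instead of executing them:

```python
import subprocess

# Sketch of automating cluster/VM spin-up via the Terraform CLI.
# Paths and var-file names are hypothetical examples.
def terraform_apply(workdir: str, var_file: str, dry_run: bool = True):
    commands = [
        ["terraform", f"-chdir={workdir}", "init"],
        ["terraform", f"-chdir={workdir}", "plan",
         f"-var-file={var_file}", "-out=plan.tfplan"],
        ["terraform", f"-chdir={workdir}", "apply",
         "-auto-approve", "plan.tfplan"],
    ]
    if dry_run:
        return commands          # inspect without touching infrastructure
    for cmd in commands:
        subprocess.run(cmd, check=True)  # raises on non-zero exit
    return commands

for cmd in terraform_apply("envs/dev", "dev.tfvars"):
    print(" ".join(cmd))
```

Applying a saved plan file (`plan.tfplan`) rather than re-planning inside `apply` keeps the reviewed plan and the applied change identical, which matters when the plan step runs in CI.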