Technical Architect/Big Data Hadoop Developer
Posted: January 27, 2017
Reference ID: 791390249
This position is responsible for custom application development following industry best practices in the Big Data space. The architect is expected to produce custom big data solutions by working closely with business users throughout all phases of the software development lifecycle, and will help design, prototype, evaluate, develop, and test software for Hadoop-based systems running on-premises and in the cloud. Experience implementing AWS cloud services is a plus. The work will involve setting up a big data cloud environment, validating implementation strategies and options, and porting functionality from existing systems to the AWS cloud. All work will be done in an Agile development SDLC/environment.
This person will work both collaboratively and independently among a team of other application developers in the areas of requirements gathering, learning industry and company best practices, solution architecture and software development methodology. The responsibilities of the individual in this position include:
Experience in understanding new architectures and the ability to drive an independent project from an architectural standpoint.
Analytical and problem-solving skills.
Ability to learn new technologies and business requirements.
Ability to build and deliver presentations to all levels of the business and effectively explain complex issues and concepts in simple, understandable language.
Develop software projects with AWS DevOps tools, technologies, and APIs associated with IAM, CloudFormation, VPC, AMIs, SNS, EC2, EBS, S3, RDS, ELB, Route 53, Security Groups, etc.
Review, analyze, and modify software systems as needed
Produce scripts/scripting to integrate with the team's automated process and tooling for build/deploy
Collaborate with users and other IT teams to maintain, develop, and deploy the best solutions
Establish Big Data and Cloud technology standards with the team and other development managers
The successful candidate will have extensive demonstrable skills and experience, including the following:
Extensive experience in solution architecture, design, and development for one or more projects.
Should be well versed in the Big Data landscape, data warehousing, and business intelligence.
Should have hands-on experience in big data technologies such as MapReduce, Hadoop, Hive, Spark, Scala, Flume, Sqoop, HBase, etc. to complete POCs, review code, and guide the team.
Should have served as lead architect for at least 2-3 Big Data implementations.
Should have in-depth experience implementing Big Data platforms on one of the leading distributions (Cloudera, Hortonworks, or MapR).
Candidates should also have good knowledge of Big Data architecture patterns, design patterns, estimation techniques, performance tuning, and troubleshooting.
Application design, development, delivery, and support on the Microsoft Windows platform: Windows, IIS, SQL Server, .NET
Experience automating the build and deployment of AWS infrastructure and platforms using Chef and/or similar tools
Knowledge of the SDLC as it relates to systems such as AWS Cloud technologies, SQL, Java, etc. (required)
Agile software development practices, i.e., Scrum/Lean/Kanban/XP, TDD/BDD, CI/CD
C#/.NET, object-oriented programming (OOP), and open source frameworks (NuGet) to build MVC web applications and .NET services
MS-SQL Server and related services: SSIS, SSRS
Utilize development tooling to track and automate software delivery through the SDLC: e.g., Jira, Confluence, Git, TeamCity, Octopus Deploy, PowerShell.
Experience with alternatives to the Microsoft stack: e.g., RabbitMQ, MongoDB, Redis, Node.js, AngularJS, Linux, Mac, iOS, Xamarin
Proven ability to complete assigned work in a timely and quality manner
Effectively collaborate with peer group (Scrum team)
Effectively communicate with our business partners: end users, user groups, and product owners