The Building Technology & Solutions organization within Johnson Controls is looking for a Hadoop Admin/Engineer responsible for the design, construction, and implementation of data transformation solutions within the Big Data ecosystem. The Application Developer will work closely with teams across the enterprise on numerous activities, from gathering requirements through implementing solutions for retrieving, loading, combining, and summarizing data from multiple systems into the Data Lake. The ideal candidate has substantial hands-on development experience with the technologies in the Hadoop stack.
The IT Hadoop Admin/Engineer - Application Developer coordinates, designs, builds, and integrates complex application technology solutions aligned to architectural standards and definitions, ensuring IT services are delivered effectively and efficiently.
- Gather requirements; design, develop, test, and document solutions for various development and support activities
- Provide user training
- Work with remote development teams to complete projects on schedule
- Analyze source system data to understand data structures, definitions, and anomalies
- Interact with customers as needed to better define and understand general reporting requirements
- Interact with other technical team members to establish ELT requirements and designs
- Design, develop, and implement ELT processes and procedures
- Ability to program in SQL and Java to perform data query, extract, transform, and load functions
- Well versed in data access and manipulation within Hadoop using tools such as Hive, Pig, Spark, Kafka and HBase.
- Bachelor's degree or equivalent in Information Technology, Computer Science, or Computer Engineering
- 6+ years of experience in IT development
- 2+ years of progressive IT experience, including hands-on experience implementing and supporting enterprise integration activities on Hadoop
- Strong Java/J2EE experience
- Must possess some experience working with Oracle, PL/SQL, MS
- Must have proven development experience within the Hadoop ecosystem (MapReduce, Hive, Pig, HBase, Sqoop, etc.), preferably with the Hortonworks distribution
- Proficiency in Scala/Cascading is a plus
- Knowledge of relational, star schema, and dimensional database modeling techniques
- Strong ETL design and development skills
- Expert knowledge of key data structures and algorithms
- Excellent written and verbal communication skills
- Experience with the entire Software Development Lifecycle (SDLC)
- Experience with software implementations involving multiple data or input sources, including reconciling the ontologies of multiple systems
- Demonstrable ability to manage confidential information with a high degree of integrity
- Ability to quickly learn new technologies, adapt to new environments, and function within a team
- Experience using Talend preferred
A little about us:
We’re shaping the future. Together, let’s make a world that’s safe, comfortable and sustainable. Tomorrow needs you.