Data Engineer
Location:
Dallas, Texas
Posted:
October 24, 2017
Reference:
48032-en_US

New York Life Insurance Company (“New York Life” or “the company”) is the largest mutual life insurance company in the United States*. Founded in 1845, New York Life is headquartered in New York City, maintains offices in all fifty states, and owns Seguros Monterrey New York Life in Mexico.

New York Life is among the financially strongest and most highly capitalized insurers in the business. The company reported 2016 operating earnings of $1.954 billion. Total assets under management at year-end 2016, with affiliates, totaled $538 billion. As of year-end 2016, New York Life’s surplus was $23.336 billion**. New York Life holds the highest possible financial strength ratings currently awarded to any life insurer from all four of the major ratings agencies: A.M. Best, A++; Fitch, AAA; Moody’s, Aaa; Standard & Poor’s, AA+. (Source: Individual Third Party Ratings Report as of 8/17/16.)

Financial strength, integrity and humanity—the values upon which New York Life was founded—have guided the company’s decisions and actions for over 170 years.

 

Data Engineer, Enterprise Data Management (Hadoop)

 

Dallas or NY Metro (Jersey City)

 

About Enterprise Data Management:

 

Enterprise Data Management is charged with creating and delivering a technology strategy for Data & Analytics. Enterprise Data Management encompasses several functions, including strategy and program governance, enterprise data architecture, solution delivery, and data governance. This engineering role provides a unique opportunity to help architect and build the platform for one of our most strategic initiatives.

 

The Role:

 

This Data Engineer will be responsible for building all pipelines and ingesting source data into our Enterprise Data Lake. Based on the business strategy for data & analytics, Enterprise Data Management is creating a robust enterprise data lake ecosystem. This engineer is expected to architect and build out our core framework for ingesting data sources in batch, streaming, and change-data-capture modes. The role requires an advanced skill set across a variety of technologies. This individual will often have to learn independently and stay on the cusp of new technologies in the Big Data and Analytics space.

 

Primary Responsibilities:

 

  • Accountable for designing and delivering against New York Life’s data technology strategy
  • Work with a team of engineers and developers to deliver against the overall technology data strategy
  • Ensure enterprise data ingestion/movement platforms are standardized, optimized, available, reliable, consistent, accessible and secure to support business and technology needs
  • Oversee enterprise data stores, warehouses, repositories, schemas, catalogs, access methods and other enterprise related data assets
  • Understand data-related initiatives within New York Life and engineer optimal designs and best-fit solutions
  • Leverage open source and vendor based products to drive scalability and efficiencies throughout the data pipeline life cycle
  • Collaborate with peers across Enterprise Data Management, to deliver on the overarching strategy
  • Develop framework, metrics and reporting to ensure progress can be measured, evaluated and continually improved
  • Stay current and informed on emerging technologies and new techniques to refine and improve overall delivery

 

  • Ensure that enterprise and industry security requirements are adhered to, especially around the usage and protection of data

 

Qualifications:

 

  • 10+ years across a variety of technologies – especially Linux, Web, Databases, and Big Data (Hadoop)
  • Deep expertise in data-related tools, including the latest data solutions (e.g., Big Data, Cloud, In-Memory Analytics)
  • Hands-on experience with Hadoop and NoSQL databases (e.g., MongoDB, MarkLogic), and insight into when to recommend a particular solution
  • Solid experience in standing up enterprise practices for Big Data, Analytics, and Self-Service
  • Proven track record for identifying, architecting and building new technology solutions to solve complex business problems
  • Capable of working with open source software, debugging issues and working with vendors toward effective resolution
  • Minimum Bachelor’s Degree in relevant field; Master’s Degree a plus

 

 

Competencies:

 

  • Thinks strategically - sets overall direction for solution design and delivery for enterprise platforms aligned to the data & analytics strategy
  • Results Driven - sets aggressive goals and is accountable for continuously driving improved outcomes, leading change and ensuring high standards
  • Excellent communication skills, both written and verbal in conveying technical design and approach for delivering technical solution
  • Pragmatic in their approach, delivering incrementally and demonstrating value
  • Strong ability to translate business requirements into technology workflows
  • Ability to help train/develop less senior people on the team
  • Other competencies: critical thinker, adaptable, self-starter, demonstrates sound judgment.

 

Skills:

  • Strong command of SQL – best practices, optimization, troubleshooting, debugging
  • Strong knowledge of RDBMS and Enterprise Warehouses
  • Some hands-on knowledge of Apache Hive and HBase
  • Proficient with Unix/Linux (building/assembling packages, shell scripts, configuration management and OS tuning)
  • Proficient with configuration management/automation tooling (Puppet/Chef/Salt)
  • Good understanding of Hadoop technologies (YARN, MR, Tez, Spark, etc.)
  • Experience with Java, Python, and APIs (JSON)
  • Experience with Kerberos and best practices for securing data a plus
  • Experience working with Vendors/Open Source in the Hadoop ecosystem
  • Knowledge of the open source community (opening issues, tracking issues and identifying problematic issues ahead of time by tracking open JIRA issues in the community)
  • Experience with version control and continuous integration (Git, Bamboo, Jenkins)
  • Experience with API Lifecycle Management and Software Lifecycle Management
  • Understanding of Networking (tracing, packet capture, etc.)

 

 

Location: Jersey City or Dallas

 

 

EOE M/F/D/V

If you have difficulty using or interacting with any portions of this Web site due to incompatibility with an Assistive Technology, if you need the information in an alternative format, or if you have suggestions on how we can make this site more accessible, please contact us at: (212) 576-5811.

*Based on revenue as reported by “Fortune 500, ranked within Industries, Insurance: Life, Health (Mutual),” Fortune Magazine, June 17, 2016.  See http://fortune.com/fortune500/  for methodology.

**Total surplus, which includes the Asset Valuation Reserve, is one of the key indicators of the company’s long-term financial strength and stability and is presented on a consolidated basis of the company.

1. Operating earnings is the key measure used by management to track the Company’s profitability from ongoing operations and the underlying profitability of the business. This indicator is based on generally accepted accounting principles in the US (GAAP), with certain adjustments the Company believes to be appropriate as a measurement approach (non-GAAP), primarily the removal of gains or losses on investments and related adjustments.

2. Assets under management represent consolidated Domestic and International insurance Company statutory assets (cash and invested assets and separate account assets) and third-party assets principally managed by New York Life Investment Management Holdings LLC, a wholly owned subsidiary of New York Life Insurance Company.

