Big Data Engineer

  • Company: Prudential
  • Location: Newark, New Jersey
  • Posted: December 03, 2017
  • Reference ID: ADV000AM
  • Video: http://video.digi-me.com/digime/jobs/IT/Prudential/Software Developer/ZP0020
Responsibilities:

  • Build distributed, scalable, and reliable data pipelines that ingest and process data at scale and in real time.
  • Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases.
  • Own product features from development through production deployment.
  • Troubleshoot data issues, perform root cause analysis, and recommend, test, and implement solutions.
  • Develop and document technical requirements and solutions.
  • Participate in design and code reviews.
  • Troubleshoot issues, make recommendations, and deliver on them.

Requirements:

  • BS in Computer Science or a related field.
  • Approximately 8 years of software development experience.
  • Minimum 5 years of active SQL development, performance tuning, data modeling, and design, preferably in Oracle.
  • Minimum 1-2 years of experience with cloud computing, AWS preferred.
  • Strong automation skills with Python and Ansible are highly preferred.
  • Solid analytical and programming experience with Java, Oracle, and SQL.
  • A flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle.
  • Proficiency with agile or lean development practices.
  • Strong object-oriented design and analysis skills.
  • Excellent written and verbal communication skills.

Qualifications:
Top skill sets / technologies in the ideal candidate:

Required:

  • Database -- Oracle 11g/12c, complex SQL queries, performance tuning concepts, backups, recovery, DR, BCP
  • Programming language -- Java, Python, SQL
  • AWS -- RDS, EC2, Redshift, S3, and others; AWS cloud data migration, AWS security
  • NoSQL -- HBase, MongoDB, Cassandra, Riak
  • ETL tools -- DataStage, Informatica
  • Code/Build/Deployment -- Ansible, Git, SVN, Maven, sbt, Jenkins, Bamboo, Terraform
 
Desired:

  • AWS Associate / Solution Architect certified
  • Batch processing -- Hadoop MapReduce, Cascading/Scalding, Apache Spark, AWS EMR
  • Stream processing -- Spark Streaming, Apache Storm, Apache Flink
 
  • Excellent communication skills and strong analytical, problem-solving, and decision-making skills are essential.
  • Zeal to learn new technologies and frameworks, and an appetite for growth.
  • Identify project risks and recommend mitigation efforts.
  • Identify project issues, communicate them and assist in their resolution.
  • Assist in continuous improvement efforts in enhancing project team methodology and performance.
  • Cooperative, team-focused attitude.
