Data Engineer (DWH)
Location:
Sunnyvale, California
Posted:
April 09, 2017
Reference:
1695413709
LinkedIn is a deeply data-driven company: data drives not only business decisions but also product features and direction, and is deeply embedded in the LinkedIn DNA. The company has a diversified business model, with revenue coming from Talent Solutions, Marketing Solutions, and Premium Subscriptions products.
The Data Analytics and Infrastructure team is responsible for building and maintaining the state-of-the-art ETL infrastructure that makes this data available and accessible to the entire company for data-driven decisions. The team works closely with Data Scientists, Product Managers, Executives, and other key parts of the business across the globe to understand their data requirements and build systems and platforms that meet or exceed those needs. The Data Analytics and Infrastructure team is committed to being an early adopter of, and contributor to, open source big data technologies. Engineers are encouraged to think outside the box, experiment with the latest technologies, and explore their limits. LinkedIn is looking for a "Rockstar" Data Warehouse Engineer to help build, scale, and maintain this critical data warehouse.
Responsibilities
•Contributing at a senior level to data warehouse design and data preparation by implementing a solid, robust, extensible design that supports key business flows.
•Performing all of the necessary data transformations to populate data into a warehouse table structure that is optimized for reporting (see the sketch after this list).
•Establishing efficient design and programming patterns for engineers as well as for non-technical stakeholders.
•Designing, integrating, and documenting technical components for seamless data extraction and analysis on the big data platform.
•Establishing best practices for the Big Data stack and sharing them across teams and business units.
•Providing operational excellence through root cause analysis and continuous improvement of Big Data technologies and processes, and contributing back to the open source community.
•Contributing to innovations and data insights that fuel LinkedIn's vision and mission.
•Working in a team environment and interacting with multiple groups on a daily basis (very strong communication skills required).
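As an illustration of the transformation work described above, here is a minimal PySpark sketch (one of the Hadoop-stack tools this role touches): it reads raw events from HDFS, rolls them up to a reporting-friendly grain, and writes a date-partitioned warehouse table. All paths, table names, and column names are hypothetical placeholders, not LinkedIn's actual schema.

    # Minimal ETL sketch: extract raw events, aggregate to a daily grain,
    # load a reporting-optimized table. Names and paths are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily_page_view_etl").getOrCreate()

    # Extract: raw page-view events landed on HDFS (hypothetical location)
    events = spark.read.parquet("hdfs:///data/raw/page_views/")

    # Transform: one row per member per day, the grain a reporting fact table would use
    daily = (
        events
        .withColumn("view_date", F.to_date("event_timestamp"))      # truncate timestamp to a date
        .groupBy("member_id", "view_date")
        .agg(
            F.count(F.lit(1)).alias("page_views"),                   # total views per member per day
            F.countDistinct("page_id").alias("distinct_pages"),      # distinct pages viewed
        )
    )

    # Load: partition by date so reporting queries prune partitions
    # instead of scanning the full history
    (
        daily.write
        .mode("overwrite")
        .partitionBy("view_date")
        .parquet("hdfs:///data/warehouse/fact_daily_page_views/")
    )

    spark.stop()

Partitioning the output by date is one common design choice for reporting tables, since most dashboards and ad hoc queries filter on a recent date range.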
Basic qualifications
•BS, MS, or PhD in Computer Science or a related technical discipline
•2+ years of relevant work experience
•Working experience with Hadoop projects/infrastructure
•Experience with data warehouse best practices in the Big Data space
•Experience in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, Flume, Sqoop, etc.)
•Experience with at least one scripting language (Shell, Python, Perl, etc.)
•Experience with an OO programming language such as Java
Preferred qualifications
•4+ years of relevant work experience
•Experience working extensively in a multi-petabyte DW environment
•Experience engineering large-scale systems in a production environment
•Ability to write, analyze, and debug SQL queries
•Exceptional problem-solving and analytical skills
•Experience with data warehouse design, ETL (Extraction, Transformation & Load), and architecting efficient software designs for DW platforms
•Knowledge of database modeling and design in a data warehousing context
•Knowledge of NoSQL stores is a plus
•Experience implementing data warehouses with MPP databases such as Teradata is a plus
•Ideal candidates will have a deep understanding of technical and functional designs for databases, data warehousing, reporting, and data mining
•Passion for continuous learning, experimenting, applying, and contributing to cutting-edge open source Big Data technologies and software paradigms

A little about us:
LinkedIn's vision is to create economic opportunity for every member of the global workforce. Our employee talent is our #1 operating priority.
