Job Description

Hadoop Engineer

Overview

Serving as a Hadoop Platform Engineer, this Consultant will be responsible for building enterprise data management platforms. In support of the business strategy for data & analytics, enterprise data management is building a robust enterprise data lake ecosystem. This Engineer will architect and build out the core data platforms for the enterprise. The role requires an advanced skill set across a variety of technologies, and this individual will often need to learn independently and stay at the forefront of new technologies in the Big Data and Analytics space.


Responsibilities

Work with a team of engineers and developers to deliver against the overall technology data strategy

Ensure enterprise data platforms are standardized, optimized, available, reliable, consistent, accessible and secure to support business and technology needs

Oversee enterprise data stores, warehouses, repositories, schemas, catalogs, access methods, and other enterprise-related data assets

Understand data-related initiatives within the company and engineer optimal designs and solutions

Drive knowledge management practices for key enterprise data platforms and collaborate on solution design and delivery

Develop frameworks, metrics, and reporting to ensure progress can be measured, evaluated, and continually improved

Stay current and informed on emerging technologies and new techniques to refine and improve overall delivery

Qualifications

Bachelor’s Degree in a relevant field; Master’s Degree a plus

8+ years of experience across a variety of technologies, including but not limited to Linux, Web, Databases, and Big Data (Hadoop)

Deep expertise in data-related tools, including the latest data solutions (e.g., Big Data, Cloud, In-Memory Analytics)

Hands-on experience with Hadoop and NoSQL databases (e.g., MongoDB, MarkLogic), and insight into when to recommend a particular solution

Solid experience standing up enterprise practices for Big Data, Analytics, and Self-Service

Proven track record of identifying, architecting, and building new technology solutions to solve complex business problems

Capable of working with open source software, debugging issues, and working with vendors toward effective resolution

Proficient with Unix/Linux (building/assembling packages, shell scripting, configuration management, and OS tuning)

Knowledge of configuration management/automation tooling (Puppet/Chef/Salt)

Solid understanding of Hadoop technologies (YARN, Hive, MapReduce, Tez, Spark, etc.)

Experience with Java, Python, and APIs

Knowledge of enabling Kerberos and of best practices for securing data a plus

Familiarity with the open source community (opening and tracking issues, and identifying potential problems ahead of time by monitoring open JIRA issues in the community)
