Data Lake Architect (Hadoop) in Irving at Vaco

Date Posted: 6/14/2018

Job Description

Data Lake Architect (Hadoop)

DETAILS

Location: Irving, TX

Position Type: Direct-Hire

Hourly / Salary: Based on Experience (BOE)

JOB SUMMARY

Vaco Technology is currently seeking a Data Lake Architect (Hadoop) for a Direct-Hire opportunity located in Irving, TX. The Data Lake Architect will build and architect next-generation Hadoop Data Lake and analytics applications on a core group of Hadoop technologies. The architect will evaluate new technologies and products and conduct research to identify opportunities that affect business strategy, business requirements, and performance, and that can accelerate access to data and automate key data flows (advanced machine learning technologies, etc.). The architect will also develop a highly scalable and extensible Big Data platform that enables the collection, storage, modeling, and analysis of massive data sets from numerous channels.

  • Define and maintain the data architecture: create strategy, research technologies, and apply technology to enable business solutions.
  • Assess and provide recommendations on business relevance, timing, deployments, etc.
  • Analyze Big Data analytic technologies and their innovative application in BI analysis and new service offerings; adopt and implement the resulting insights and best practices.
  • Perform architecture design, data modeling, and implementation of the Big Data platform and analytic applications.
  • Enable Big Data, batch, and real-time analytical solutions that leverage emerging technologies.
  • Gather data from a variety of sources, get it into the right formats, ensure it adheres to data-quality standards, and assure downstream users quick access to the data (a minimal sketch of this flow follows this list).
  • Ensure proper configuration management and change controls are implemented during code migrations.
  • Conduct performance tuning of Hadoop clusters.
  • Monitor and manage Hadoop cluster jobs, performance, capacity planning, security, etc.
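
The data-gathering and quality duty above is, in practice, a pipeline job. Below is a minimal PySpark sketch of that flow; the paths, column names (event_id, event_ts), and quality rules are illustrative assumptions, not details from this posting.

    # Minimal sketch: land raw events, gate them on quality rules, and publish
    # curated, partitioned data to the lake. All paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

    # Collect raw JSON from an upstream channel's landing zone.
    raw = spark.read.json("hdfs:///landing/events/")

    # Data-quality gates: required key present, timestamp parseable.
    clean = (
        raw.filter(F.col("event_id").isNotNull())
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .filter(F.col("event_ts").isNotNull())
    )

    # Quarantine rejects so downstream users see only conforming records.
    rejects = raw.join(clean.select("event_id"), "event_id", "left_anti")
    rejects.write.mode("append").parquet("hdfs:///quarantine/events/")

    # Publish curated data partitioned by date for quick downstream access.
    (clean.withColumn("dt", F.to_date("event_ts"))
          .write.mode("append")
          .partitionBy("dt")
          .parquet("hdfs:///lake/curated/events/"))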

JOB REQUIREMENTS

  • Data Lake Architect: build and architect Hadoop Data Lake and analytics applications.
  • Develop highly scalable Big Data platforms for the collection, storage, modeling, and analysis of massive data sets.
  • Conduct performance tuning of Hadoop clusters (see the tuning sketch after this list).
  • Monitor and manage Hadoop cluster jobs, performance, capacity planning, security, etc.
  • Define, create, and maintain data architecture and strategy.
  • Perform architecture design, data modeling, and implementation of the Big Data platform and analytic applications.
  • Tableau experience.
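
Since much analytics work on a Hadoop cluster runs through Spark on YARN, one concrete face of the tuning requirement above is right-sizing job resources and shuffle parallelism. The sketch below shows the relevant knobs in PySpark; the specific values are placeholder assumptions that would come from profiling the actual workload, not from this posting.

    # Illustrative Spark-on-YARN tuning knobs; all values are placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("tuned-analytics")
        # Size executors against the cluster's YARN container limits.
        .config("spark.executor.instances", "20")
        .config("spark.executor.cores", "4")
        .config("spark.executor.memory", "8g")
        # Match shuffle parallelism to data volume to avoid straggler tasks.
        .config("spark.sql.shuffle.partitions", "400")
        # Compressed columnar output reduces storage and scan cost.
        .config("spark.sql.parquet.compression.codec", "snappy")
        .getOrCreate()
    )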

KEYWORDS

Data Lake, Architect, Hadoop, Data Modeling, Capacity Planning, Tableau, Performance Tuning