Data Architect in Charlotte, NC at Vaco

Date Posted: 10/17/2020

Job Description

**U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.**

The Data Architect will be responsible for overseeing the implementation of data strategy across all product lines. Specific responsibilities include:

  • Participate in the formation of the enterprise data strategy relating to both transactional and analytic data processing.
  • Anticipate business needs and proactively integrate data to enable stakeholders to make data-driven decisions.
  • Work on large-scale data projects, building data lakes and data fabrics.
  • Map the systems and interfaces used to manage data, analyze the current state, define the desired future state, and conceive the projects needed to close the gap between them.
  • Provide a standard common business vocabulary, identify strategic data requirements, outline high level integrated designs to meet these requirements, and align with enterprise strategy and related business architectures.
  • Set data architecture principles, create models of data that enable the implementation of the intended business architecture.
  • Create diagrams showing key data entities and create an inventory of the data needed to implement the architecture vision.
  • Establish best practices for naming, governing, managing, modeling, cleansing, transforming, moving, storing, and searching all data within the organization and perform code reviews.
  • Interact with internal and external resources to understand data requirements, source location, transformation requirements, calculations, etc.
  • Serve as hands-on lead for data consolidation and syndication (from various source systems including internal databases, cloud-based data stores, and third-party API/web service calls, in batch, near-real-time, and real-time).
  • Establish a process for the use of data architecture frameworks to include business data management, business intelligence, data quality and data usage.
  • Manage own learning and contribute to technical skill building of the team.


Requirements:

  • 10+ years of experience in data architecture (relational and dimensional data modeling, with heavy emphasis on providing data for analytics).
  • Experience with ETL data transformation and design using tools such as Talend and SSIS.
  • Hands-on experience architecting complex data lake and data fabric solutions.
  • Understanding of data pipelines, including data ingestion, movement, and transformation across cloud services.
  • Hands-on experience with machine learning, artificial intelligence applications, and data science development languages (Spark, R, Python, etc.).
  • Demonstrated knowledge and/or understanding of cloud-based tools and services that support storage of both structured and unstructured data.
  • Hands-on software development experience using DevOps practices such as CI/CD.
  • Experience estimating cloud costs and identifying associated cost-control mechanisms.
  • Deep experience in software design and development, master data management, database architecture, storage, and security.
  • Experience with big data technologies such as Azure Synapse, Snowflake, Google BigQuery, etc.
  • Experience deploying solutions to Cloud platforms such as Azure, GCP, etc.
  • Experience extracting/writing data through API calls and web services.
  • Strong SQL skills.
  • Excellent communicator.
  • Highly detail-oriented and intellectually curious.
  • Strong analytical, problem-solving, and prioritization skills.


Preferred:

  • Experience with data provisioning to machine learning technologies.
  • Experience with and knowledge of the Oracle EBS database.
  • Strong understanding of microservices-based architecture.