Data Architect | $130k | Direct-Hire | Richmond, VA at Vaco

Date Posted: 9/1/2020

Job Description



What you'll do:

    • Work directly with our clients and internal executives in presale and client-delivery environments to architect sustainable data solutions, design data engineering delivery plans, and mentor our data engineers as they build and maintain these solutions.
    • Leverage your advanced software engineering expertise, including experience analyzing transactional and reporting system data and evaluating available cloud and on-prem storage and retrieval options. Design interactions with large-scale processing systems, develop real-time integrations leveraging RESTful APIs, and develop scalable data structures to address our clients' most pressing data engineering needs.
    • Partner with clients and provide leadership to our data engineering team members in developing, constructing, testing, and maintaining first-class data architectures.
    • Share your experience supporting real-time data, data streaming, scalability of the platform and management of large volumes of data.
    • Use your mastery of a variety of languages and tools to marry systems and data while recommending ways to improve current systems and data reliability, efficiency and quality.
    • Provide thought leadership internally and externally, always staying abreast of the very latest tools and technology available.
    • Leverage your deep expertise in data engineering best practices, serving as a mentor and coach to team members and sharing that expertise throughout the team.
    • Create sophisticated analytics programs, applying machine learning and statistical methods to prepare data for use in predictive analytics.
    • Focus on automating our clients' work through the solutions you develop.
    • Collaborate with API developers to build data-driven microservices for our clients (a minimal sketch follows this list).
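
For candidates wondering what this looks like in practice, below is a minimal sketch of a data-driven microservice in Python: a single REST endpoint serving aggregates from a local store. Flask and SQLite are stand-ins chosen for brevity, and the endpoint path, table, and column names are illustrative assumptions, not any client's actual stack.

    # Minimal sketch: a REST endpoint that serves per-day aggregates
    # from a local SQLite store. All names here are illustrative.
    import sqlite3

    from flask import Flask, jsonify

    app = Flask(__name__)
    DB_PATH = "events.db"  # assumed local store for this sketch

    @app.route("/api/v1/daily-counts")
    def daily_counts():
        # Assumes an `events` table with an ISO-8601 `created_at` column.
        with sqlite3.connect(DB_PATH) as conn:
            rows = conn.execute(
                "SELECT date(created_at) AS day, COUNT(*) AS n "
                "FROM events GROUP BY day ORDER BY day"
            ).fetchall()
        return jsonify([{"day": day, "count": n} for day, n in rows])

    if __name__ == "__main__":
        app.run(port=8080)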


Preferred Skills (including, but not limited to):

    • Designing, developing, scaling, and maintaining data pipelines using Spark, Kafka, Hive, and Python or Scala.
    • Experience with modern ETL and workflow tools such as Apache Airflow, Luigi, and Jenkins (see the Airflow sketch after this list).
    • Experience designing and implementing SQL and NoSQL systems of record.
    • Experience designing and implementing various data hub architectures, supporting a variety of business use cases.
    • Developing within the Hadoop ecosystem.
    • Putting modern data platforms into use, including platform-as-a-service variants.
    • Providing expertise with humility; communicating complex ideas to clients and technical staff.
    • Using Git or GitHub in a CI/CD development workflow.
    • Developing microservices using languages like Java, Python, or JavaScript, and using REST APIs.
    • Writing effective technical documentation and thought-leadership artifacts (best practices, blogs, client documentation).
    • Automating deployments using DevOps tools like Docker, Ansible, Terraform, or Kubernetes.
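
For candidates less familiar with the workflow tools named above, a minimal Apache Airflow DAG (Airflow 2 API) looks roughly like the sketch below; the DAG id, schedule, and task bodies are illustrative assumptions rather than a real pipeline.

    # Minimal sketch: an Airflow DAG that runs an extract step before a
    # load step once a day. All names here are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull rows from the source system")

    def load():
        print("write rows to the warehouse")

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2020, 9, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract must finish before load starts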

Job Requirements

Data Architect, Hadoop, Spark, Kafka, Hive, Python or Scala, ETL, SQL, REST APIs, Docker, Ansible, Terraform, or Kubernetes.