Data Engineer in Charlotte at Vaco

Date Posted: 9/10/2020

Job Description

**U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.**

Vaco is looking for a critical-thinking Data Engineer who will play a pivotal role in architecting, building, and monitoring the data pipelines leveraged by the data scientists, developers, and product managers on the data analytics team.


Responsibilities

  • Work cross-functionally with analytics product managers, data scientists, other data engineers, software developers, and analytics engineers to glean insights and needs from the business, brainstorm the best solutions, build/leverage data pipelines, build and test statistical models, productionize code, and deploy models
  • Think critically through business needs and architect data pipelines and data flows that meet those needs
  • Leverage Python, SQL, and data pipeline tools such as Airflow to build complex ETL processes that move, cleanse, structure, and transform internal and external data
  • Test and perform quality-control checks on processes to ensure they meet the needs of the business
  • Engage with product managers, data scientists, analytics engineers, and other data engineers to confirm that business needs are being met by the data pipelines/flows
  • Monitor and troubleshoot existing data pipelines and infrastructure (e.g., databases and servers), establish automated alerts and automated re-triggering of processes, and document the architecture of data pipelines and data flows
  • Assist data scientists in an ad-hoc manner and become upskilled in data science
  • Assist analytics engineers in an ad-hoc manner and become upskilled in model implementation, deployment, and integration


Qualifications

  • Creative problem solver who thinks through how processes fit together and affect each other (i.e., knows and takes ownership of the entire ecosystem)
  • Proactive individual who continuously strives for process improvement, anticipates what could go wrong, and plans/builds solutions, alerts, automated re-triggers, etc. accordingly
  • SSIS (SQL Server Integration Services) or similar ETL and process-flow experience
  • Experience with database, data, schema, architecture, and pipeline design (more than just dumping data)
  • Apache Airflow experience a plus
  • Works well in teams (i.e., does not like working in a silo)
  • Able to think and solve problems independently or as part of a team, with either limited or robust requirements
  • Strong business acumen
  • Expertise in programming, data flow and data processes, data cleansing, data structure, data quality, and data analysis
  • Intermediate-to-expert programming experience in Python and SQL
  • Proficient in Microsoft Office products
  • Experience with cloud platforms (preferably Azure but AWS acceptable as well)
  • Some level of comfort in a *NIX environment
  • Excellent written, verbal, and presentation-based communication skills
  • Experience with MicroStrategy and/or other reporting platforms a plus
  • Experience with some type of source code management (e.g., Git)
  • Minimum of 3-5 years of relevant work experience