• Data Engineering Lead

  • About Excella

    Excella is a technology consulting firm serving commercial, non-profit, and federal clients in the Washington, DC area. Excella builds innovative custom software solutions with a strong focus on Agile engineering practices. We believe that great work leads to great things, for our clients and for our employees. We are growing fast and need passionate, innovative people who love working with technology and are ready to make an impact.


    Here's what you can expect from us:

    • We care about our employees. In fact, The Washington Post and The Washington Business Journal consistently rank us as a "Best Place to Work."
    • You'll work with great people who love what they do: our team includes published authors, certified trainers, and internationally renowned speakers.
    • We have a "bring your own device" workplace and will share the cost of a new computer of your choice, Mac or PC. It's up to you.
    • We'll invest in your career by providing 3 days of paid professional development every year, including travel and registration fees to attend classes and conferences, in addition to tuition assistance for degrees and certifications.
    • Starting day one, every employee is bonus eligible and receives 17 days of paid vacation.
    • You can bike, drive, or metro to work; our commute reimbursement plan has you covered.
    • We cater dinner bi-monthly, and always have healthy snacks available!
    • You'll have fun! We hold monthly social events all year long, including a summer getaway for you and your family.


    The Data Engineering Lead manages the design and build of modern analytics environments that comprise raw data stores (data lakes) and cleansed data repositories (data warehouses, data marts), populated by batch or streaming data pipelines. The Data Engineering Lead creates a robust, sustainable, and flexible design and leads technical delivery using Agile frameworks such as Scrum or Kanban. The ideal candidate is a proven leader and mentor, skilled at enabling and serving the development team.


  • Responsibilities

    • Designing, building, testing, and maintaining highly scalable data architectures
    • Building data repositories such as data warehouses, data lakes, and data marts
    • Developing and managing data processes to ensure that data is available and usable
    • Automating data pipelines, platforms, and testing
    • Managing and monitoring data quality
    • Researching data acquisition and evaluating suitability
    • Integrating data management solutions into client environments
    • Actively managing risks to data and ensuring there is a data recovery plan
    • Working closely with data architects, data scientists, and data visualization developers to deliver sustainable data solutions


  • Qualifications

    • 7+ years of relevant professional work experience.
    • Minimum of 5 years of hands-on development experience in a technical role.
    • Experience explaining technical concepts in a business-value context.
    • A technically savvy, entrepreneurial spirit who thrives in environments that reward self-initiative and resourcefulness.
    • Experience and expertise in the following:
      • Common data warehouse architectural patterns (Star Schema, data integration patterns, ABAC, data quality frameworks etc.)
      • Data lake design patterns and technology options (schema on read, metadata capture, search framework)
      • Use of scripting languages, preferably Python
      • Use of data integration tools such as Talend, SSIS, or Azure Data Factory
    • Project experience using the Scrum or Kanban framework.
    • Aptitude and desire for learning new technologies.
    • Ability to teach technical concepts.
    • Skilled in facilitating discussion to solve problems with project stakeholders.
    • Professionalism, including written and oral communication: the ability to communicate collaboratively in front of a whiteboard, and to understand your audience and adjust your communication style to fit.

