• Data Integration (ETL) Developer

  • About Excella

Excella Consulting is a leading provider of Agile software development and data and analytics solutions to clients in the federal, commercial, and non-profit sectors. We believe that great work leads to great things: our experts measure success by the positive impact we make on our clients, community, and colleagues. We are growing fast and need passionate, innovative people who love working with technology and are ready to make an impact.


    Here's what you can expect from us:


    • We care about our employees. In fact, The Washington Post and The Washington Business Journal consistently rank us as a "Best Place to Work."
    • You'll work with great people who love what they do: our team includes published authors, certified trainers, and internationally renowned speakers.
    • We have a "bring your own device" workplace and will share the cost of a new computer of your choice -- Mac or PC. It's up to you.
    • We'll invest in your career by providing 3 days of paid professional development every year, including travel and registration fees to attend classes and conferences, in addition to tuition assistance for degrees and certifications.
    • Starting day one, every employee is bonus eligible and receives 17 days of paid vacation.
    • You can bike, drive, or metro to work -- our commute reimbursement plan has you covered.
    • We cater dinner once a month, and always have healthy snacks available!
    • You'll have fun! We hold monthly happy hours and events all year long, including a summer weekend getaway for you and your family.


    The Data Integration (ETL) Developer is responsible for developing applications and processes that are required to extract, transform, cleanse, and manage data and metadata to be loaded into a data warehouse, data mart, or data lake.


    • Works with the Technical Lead and Data Solutions Architect to understand the solution vision and create the data integration design.
    • Develops processes to integrate data from multiple sources into accessible and performant structures that support analytic needs.
    • Defines and implements data quality logic associated with data processing flows.
    • Adheres to data security requirements at all stages of data processing.
    • Supports functional, integration, and user acceptance testing.
    • Automates data processing flows in support of Continuous Integration and Continuous Deployment.
    • Follows industry standards and best practices.
    • Interacts directly with client stakeholders who are in business and/or technical roles.


    Technical Requirements:


    • Proficient in use of SQL and SQL scripts
    • Proficient in data warehousing concepts and constructs, including star schema data models
    • Familiarity with Hadoop-based Big Data solutions (e.g. data lakes)
    • Proficient working with large data volumes (e.g. millions to billions of records)
    • Proficient in performance tuning SQL, including the use of explain plans, partitions, hints, etc.
    • Experience working with source code management tools (e.g. GitHub)
    • Experience working in a team using Agile delivery frameworks (e.g. Scrum, Kanban)
    • Experience developing analytics solutions hosted in the Cloud is preferred (e.g. AWS, Azure)
    • Experience working with MapReduce and/or Spark is preferred
    • Experience working with CI tools (e.g. Jenkins, TeamCity) is preferred

    Non-Technical Requirements:


    • A passion for enabling analytics solutions and learning new techniques
    • Able to work in a fast-paced development environment as part of a small, cross-functional team (5-10 people)
    • Self-motivated and self-managing
    • Must be able to supervise and/or mentor other developers
    • Strong interpersonal and verbal communication skills; proven experience with direct client interaction is preferred
    • Strong documentation skills; able to create technical process flow diagrams and solution maintenance documentation


    • B.S. degree or equivalent work experience in computer programming, analysis, and design is required. Education should include both formal and work-related experience
    • 5+ years of experience working with relational databases (e.g. Oracle, SQL Server, PostgreSQL)
    • 5+ years of experience designing and building data processing workflows using tools such as SSIS, Talend, Pentaho, or Informatica

