Quality Assurance

DataJoint and DataJoint Elements serve as a framework and as starting points for numerous new projects, setting the standard of quality for data architecture and software design. To uphold this standard, the following policies have been adopted into the software development lifecycle (SDLC).

Coding Standards

When writing code, the following principles should be observed.
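As a loose illustration only, and not a statement of the principles themselves, the hypothetical sketch below shows the kind of conventions commonly followed in DataJoint code, such as class docstrings and commented attributes in table definitions. The schema and table names are invented for this example, and a configured database connection is assumed.

```python
# Hypothetical example of a documented DataJoint table definition.
import datajoint as dj

schema = dj.Schema("example_lab")


@schema
class Subject(dj.Manual):
    """Experimental subjects registered in the lab."""

    definition = """
    subject_id    : varchar(16)   # institution-assigned subject identifier
    ---
    species       : varchar(32)   # e.g. Mus musculus
    date_of_birth : date          # subject date of birth
    """
```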

Automated Testing

All components and their revisions must include appropriate automated software testing to be considered for release. The core framework must undergo thorough performance evaluation and comprehensive integration testing.

Generally, this includes tests related to:
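As an illustration, a minimal integration test in this spirit might look like the following sketch. It assumes a disposable test database reachable through the credentials in dj.config; the schema name and the Session table are hypothetical.

```python
import datajoint as dj
import pytest


@pytest.fixture
def schema():
    # Create an isolated schema for this test and drop it when the test ends.
    test_schema = dj.Schema("test_qa_demo")
    yield test_schema
    test_schema.drop(force=True)


def test_insert_and_fetch_roundtrip(schema):
    @schema
    class Session(dj.Manual):
        definition = """
        session_id   : int
        ---
        session_note : varchar(255)
        """

    Session.insert1({"session_id": 1, "session_note": "smoke test"})
    # The inserted row should come back unchanged.
    assert (Session & {"session_id": 1}).fetch1("session_note") == "smoke test"
```

Tests like this are typically run in continuous integration against a disposable, containerized database so that each change is exercised in a clean environment.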

Code Reviews

When introducing new code into the codebase, the following will be required for acceptance by the DataJoint core team into the main code repository.

Release Process

Upon satisfactory adherence to the above Coding Standards, Automated Testing, and Code Reviews:

For external research teams that reach out to us, we will provide engineering support to help them adopt the updated software, collect feedback, and resolve issues, following the processes described in the section below. If an update requires changes to the database schema design or data formats, a data migration process will be provided upon request.
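The exact migration steps depend on the schema revision, but as a purely hypothetical sketch, a renamed attribute could be carried over by projecting the old table into the revised one. The schema, table, and attribute names below are invented for illustration and assume a configured database connection.

```python
import datajoint as dj

# Connect to the existing (v1) pipeline without re-declaring its tables.
v1 = dj.create_virtual_module("v1", "lab_pipeline_v1")

schema_v2 = dj.Schema("lab_pipeline_v2")


@schema_v2
class Session(dj.Manual):
    definition = """
    session_id   : int
    ---
    session_note : varchar(255)   # renamed from `note` in the v1 design
    """


# Copy existing rows into the revised table, renaming the attribute on the fly.
Session.insert(v1.Session().proj(session_note="note"), skip_duplicates=True)
```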

User Feedback & Issue Tracking

All components will be organized in GitHub repositories with guidelines for contribution, feedback, and issue submission to the issue tracker. For more information on the general policy for issue filing, tracking, and escalation, see the DataJoint Open-Source Contribute policy. For research groups that reach out to us, our team will work closely with them to collect feedback and resolve issues. Typically, issues will be prioritized based on their criticality and impact. If new feature requirements become apparent, they may trigger the creation of a separate workflow or a major revision of an existing workflow.