SDLC Quality System - Feature Guide

Overview

The SDLC Quality System ensures every line of code shipped through Fiftyknots meets production standards before it reaches your customers. This system combines automated checks, expert reviews, and AI-powered validation to catch issues early - saving you from costly rework and protecting your venture’s reputation. Think of it as your technical quality gate that prevents half-baked implementations from derailing your momentum.

Step-by-Step Guide

  1. Submit your implementation - Navigate to the Delivery page (/delivery/expertsubmission) and upload your completed work with all required documentation. The system generates a manifest of everything you’re submitting.
  2. Track quality verification - Your submission enters the verification pipeline (/admin/verificationpipeline) where automated checks run first. Watch the real-time status on your Developer Dashboard (/developerdashboard) as the system validates code quality, security standards, and requirement traceability.
  3. Review quality assessment - Check the quality report generated for your submission. If automated checks pass, the system routes your work to human verification (HITL) for expert review. You’ll see this status change in your dashboard.
  4. Address feedback if needed - If changes are requested, you’ll land on the Expert Rework View (/expert/expertreworkview) with specific items to fix. Make corrections and resubmit through the same flow.
  5. Receive client review - Once quality checks pass, your work goes to the client for acceptance (/review/clientreview). The Client Review page shows acceptance status and any final feedback before payment release.
  6. Confirm completion - After client approval, the system releases payment from escrow and closes the quality loop. View your payment in Transaction History (/transactionhistory).
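The six-step flow above can be modeled as a simple state machine. The sketch below is purely illustrative: the state names are assumptions for this guide, not the platform's actual API or status values.

```python
# Illustrative sketch of the SDLC quality loop as a state machine.
# State names are assumptions; they are not Fiftyknots' real status values.

TRANSITIONS = {
    "submitted": ["automated_checks"],
    "automated_checks": ["hitl_review", "rework_requested"],
    "hitl_review": ["client_review", "rework_requested"],
    "rework_requested": ["submitted"],       # fix items, then resubmit
    "client_review": ["payment_released", "rework_requested"],
    "payment_released": [],                  # quality loop closed
}

def can_transition(current: str, target: str) -> bool:
    """Return True if moving from `current` to `target` is a valid step."""
    return target in TRANSITIONS.get(current, [])

# Happy path: submission -> checks -> expert review -> client -> payment.
happy_path = ["submitted", "automated_checks", "hitl_review",
              "client_review", "payment_released"]
assert all(can_transition(a, b) for a, b in zip(happy_path, happy_path[1:]))
```

Note that every rework loop returns to the same submission flow, so resubmissions go through the identical pipeline as first submissions.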

Common Questions

Q: What happens if my submission fails automated quality checks?
A: You get immediate feedback on what failed. Fix the issues and resubmit - there’s no penalty for failed automated checks. They exist to help you catch problems before human reviewers see your work.
Q: How long does the verification process take?
A: Automated checks complete in minutes. Human verification (HITL) typically completes within 24-48 hours depending on complexity. You can track progress in real-time on your Developer Dashboard.
Q: Can I dispute a rejection decision?
A: Yes. Use the dispute initiation endpoint from your submission details. The system routes disputes to an independent adjudicator who reviews both your work and the rejection reasoning. This creates an audit trail and ensures fair resolution.
Q: What quality standards does the system enforce?
A: The system validates against the PRD traceability matrix, runs security certification checks, performs code quality analysis, and verifies documentation completeness. Each check maps back to requirements you agreed to at project start.
Q: Do I see quality metrics before submitting?
A: Yes. Run quality checks locally before submission using the quality-check endpoints. This shows you exactly what the automated system will validate, letting you fix issues before formal submission.
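A local pre-submission run can be sketched as below. The check names mirror the four categories this guide lists (traceability, security, code quality, documentation), but the check logic itself is a placeholder, not the platform's real quality-check endpoints.

```python
# Hypothetical local pre-submission runner. Check names follow the guide's
# four categories; the lambdas are illustrative stand-ins, not real checks.

def run_checks(submission: dict) -> list[str]:
    """Return the names of checks that fail for a submission manifest."""
    checks = {
        "prd_traceability": lambda s: bool(s.get("requirement_ids")),
        "security_certification": lambda s: s.get("secrets_scanned", False),
        "code_quality": lambda s: s.get("lint_errors", 1) == 0,
        "documentation": lambda s: bool(s.get("docs")),
    }
    return [name for name, check in checks.items() if not check(submission)]

submission = {"requirement_ids": ["REQ-1"], "secrets_scanned": True,
              "lint_errors": 0, "docs": "README"}
print(run_checks(submission))  # -> []
```

Running something like this before formal submission surfaces the same failure categories the automated pipeline reports, so rework happens on your machine rather than in the verification queue.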

Troubleshooting

Issue: Submission stuck in “processing” status
Check the processing status endpoint for your submission. If automated checks hang for more than 30 minutes, contact support through the Support Tickets page (/support/tickets). Include your submission ID and timestamp.
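The 30-minute rule above can be automated on your side. This is a sketch under stated assumptions: `fetch_status` is a hypothetical stand-in for the real processing status endpoint, and the return strings are illustrative.

```python
import time

# Sketch of client-side stuck-submission detection: if a submission has
# been "processing" longer than 30 minutes, stop polling and escalate to
# support. `fetch_status` is a hypothetical stand-in for the real endpoint.

STUCK_AFTER_SECONDS = 30 * 60

def watch_submission(fetch_status, started_at: float, now=time.time) -> str:
    status = fetch_status()
    if status != "processing":
        return status
    if now() - started_at > STUCK_AFTER_SECONDS:
        return "escalate_to_support"  # include submission ID + timestamp
    return "keep_waiting"

# Simulated: still processing 31 minutes after submission started.
result = watch_submission(lambda: "processing",
                          started_at=0, now=lambda: 31 * 60)
print(result)  # -> escalate_to_support
```

Injecting the clock (`now`) keeps the timeout logic testable without waiting 30 real minutes.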
Issue: Quality report shows failures I don’t understand
Navigate to the detailed quality report from your submission page. Each failure includes the specific requirement it violates and links to relevant PRD sections. If still unclear, request clarification through the Sherpa Bridge (/sherpa-bridge/send) - Sherpas can explain technical requirements in plain language.
Issue: Client rejected work but didn’t provide clear feedback
Check the rejection feedback endpoint for your submission. If the feedback is vague, request clarification through the planning review feedback iteration system, which requires structured feedback tied to specific deliverables.
Issue: Dispute resolution taking too long
Review the dispute audit trail to see current status. The system tracks adjudication timelines - if overdue, it automatically escalates. You can also view overdue adjudications metrics to confirm your case is flagged for priority handling.
Related Features

PRD Traceability - Every quality check references specific requirements in your PRD. Use the PRD Traceability page (/prdtraceability) to see exactly which requirements each deliverable satisfies. This connects your implementation to the original specification.

Planning Review System - Before development starts, this system aligns Sherpa, customer, and implementation team on what “done” means. Strong planning reviews reduce quality issues during submission because everyone agreed on acceptance criteria upfront.

Developer Analytics - Track your quality metrics over time on the Developer Analytics page (/developeranalytics). See your pass rate, common failure patterns, and how you compare to peer benchmarks. Use this data to improve submission quality and reduce rework cycles.
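A traceability check of the kind the PRD Traceability page performs can be sketched like this. The requirement IDs and deliverable shapes are illustrative assumptions, not the platform's actual data format.

```python
# Sketch of a PRD traceability coverage check: given the requirement IDs
# each deliverable claims to satisfy, report requirements no deliverable
# covers. IDs and dict shapes are illustrative, not the platform's format.

def uncovered_requirements(prd_ids, deliverables):
    """Return PRD requirement IDs that no deliverable satisfies."""
    covered = set()
    for d in deliverables:
        covered.update(d.get("satisfies", []))
    return sorted(set(prd_ids) - covered)

prd = ["REQ-1", "REQ-2", "REQ-3"]
work = [{"name": "auth-module", "satisfies": ["REQ-1", "REQ-3"]}]
print(uncovered_requirements(prd, work))  # -> ['REQ-2']
```

An empty result means every requirement maps to at least one deliverable, which is the condition the traceability matrix verifies before submission passes.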