Documentation
Complete reference for the gspec workflow and all 11 commands.
Workflow Overview
gspec organizes product specification into a structured workflow. Each stage builds on the previous, giving your AI coding tools progressively deeper context.
Only Define and Build are required. The four Define commands (profile, style, stack, practices) establish your product foundation. The implement command reads all specs and builds software. Everything in between adds optional depth — use what your project needs.
All specs are Markdown files stored in a gspec/ directory in your repository. They're version-controlled, human-readable, and automatically kept in sync through gspec's spec-sync system.
profile (Business Strategist)
What it does
Defines what your product is, who it serves, and why it exists. This is the foundational spec that all other commands reference — it establishes product identity, target audience, value proposition, and market positioning.
What it produces
A comprehensive product profile saved to gspec/profile.md. Includes product overview, target audience with pain points, value proposition, use cases, competitive landscape, and key messaging.
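To make the structure concrete, here is a minimal sketch of what a generated gspec/profile.md might contain. The product ("TaskLoop"), audience, and headings are illustrative, not gspec's exact output:

```markdown
# Product Profile

## Overview
- **Name:** TaskLoop (hypothetical example product)
- **Tagline:** Kanban boards that plan themselves
- **Category:** Project management for small dev teams

## Target Audience
Solo developers and 2-5 person teams who lose time re-prioritizing work by hand.

### Pain Points
- Task backlogs go stale within a week
- Existing tools assume a dedicated project manager

## Value Proposition
Automatic re-prioritization, so the board reflects reality without manual grooming.
```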
Example invocation
/profile
Key questions
The command guides you through a conversation covering:
- Product name, tagline, and category
- Target audience and their pain points
- Core value proposition and differentiation
- Key use cases and scenarios
- Market context and competitive landscape
Best practices
- Run this first — it's the foundation everything else builds on
- Be specific about your target audience; vague audiences produce vague specs
- Include real pain points you've observed, not hypothetical ones
- Keep the tagline under 10 words and the elevator pitch under 2 sentences
Common pitfalls
- Skipping the profile and jumping to style or stack — downstream specs lack context
- Being too broad with the target audience ("all developers" tells the AI nothing)
- Filling in aspirational features as if they exist today
Related commands
style (UI Designer)
What it does
Generates a visual design system with colors, typography, spacing, component patterns, and design tokens. Adopts the perspective of a UI Designer to create a cohesive visual language for your product.
What it produces
A comprehensive style guide saved to gspec/style.md. Includes color palette with hex/RGB/HSL values, typography scale, spacing system, component specifications (buttons, cards, forms), theme tokens for dark/light mode, and accessibility guidelines.
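A sketch of what a generated gspec/style.md might look like. The token names, values, and fonts are illustrative placeholders, not gspec's actual output format:

```markdown
# Style Guide

## Color Palette
| Token            | Hex     | Usage                         |
|------------------|---------|-------------------------------|
| --color-primary  | #2563EB | Buttons, links                |
| --color-surface  | #FFFFFF | Card backgrounds (light mode) |

## Typography
- Headings: Inter, 1.25 modular scale
- Body: Inter, 16px, 1.5 line height

## Spacing
4px base unit: 4, 8, 12, 16, 24, 32, 48

## Accessibility
All text/background pairs must meet WCAG AA (4.5:1 contrast for body text).
```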
Example invocation
/style
Key questions
The command guides you through a conversation covering:
- Design vision and visual personality (minimal, bold, playful, etc.)
- Target platforms (web, mobile, desktop)
- Brand colors or color preferences
- Typography preferences
- Dark mode, light mode, or both
Best practices
- Run profile first so the style command can align design with brand personality
- Provide a clear visual personality direction — "clean and minimal" produces better results than "nice"
- Review the generated color palette for accessibility compliance (WCAG AA contrast ratios)
- The style guide is consumed by implement — keep it structured and specific
Common pitfalls
- Generating a style guide without a profile — the design won't reflect your product's personality
- Over-specifying in the conversation and constraining the Designer role too much
- Ignoring accessibility requirements in the output — check contrast ratios
Related commands
stack (Solutions Architect)
What it does
Defines your technology choices — frameworks, languages, runtime, database, hosting, and key libraries. Documents every decision with rationale so your AI tools make consistent technology choices.
What it produces
A technology stack definition saved to gspec/stack.md. Includes core languages, runtime environment, frontend and backend frameworks, database, hosting, CI/CD, package management, and technology-specific practices.
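A sketch of a possible gspec/stack.md for the same hypothetical product; the specific technologies and rationales are illustrative choices, not recommendations or gspec's exact output:

```markdown
# Technology Stack

## Languages & Runtime
- TypeScript 5.x on Node 20 LTS (single language across the stack; team familiarity)

## Frontend
- React 18 + Vite; Tailwind CSS for styling
- Anti-pattern: do not add CSS-in-JS alongside Tailwind

## Backend
- Fastify REST API (lightweight, strong TypeScript support)

## Database
- PostgreSQL 16 via Prisma (relational model fits board/column/card data)
```

Note that each entry carries its "why" in parentheses, which is exactly what the best practices below call for.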
Example invocation
/stack
Key questions
The command guides you through a conversation covering:
- Programming languages and runtime environment
- Frontend framework and styling approach
- Backend framework and API style
- Database and data storage
- Hosting, deployment, and CI/CD
- Key libraries and third-party integrations
Best practices
- Run profile first — the stack should match your audience and scale requirements
- Document the "why" behind each technology choice, not just the "what"
- Include version constraints (e.g., "Node 20 LTS") to prevent AI tools from using incompatible versions
- Note anti-patterns specific to your stack (e.g., "Don't use CSS-in-JS alongside Tailwind")
Common pitfalls
- Listing technologies without rationale — the AI can't make good tradeoff decisions without context
- Forgetting to specify development tools (linters, formatters) that affect code style
- Not mentioning infrastructure constraints that impact architecture decisions
Related commands
practices (Engineering Lead)
What it does
Establishes development standards — testing conventions, code quality rules, git workflow, error handling, documentation requirements, and your definition of done. Ensures your AI produces code that meets your team's standards.
What it produces
A development practices guide saved to gspec/practices.md. Includes testing standards (unit, E2E, when to write tests), code quality rules, naming conventions, git practices, error handling patterns, security practices, and definition of done.
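A sketch of what a gspec/practices.md might contain; the frameworks and checklist items are illustrative assumptions, not the command's fixed output:

```markdown
# Development Practices

## Testing
- Unit test all non-trivial functions (Vitest)
- E2E tests (Playwright) for critical user flows only
- Do NOT test: framework glue code, trivial getters

## Git
- Trunk-based development; Conventional Commits (`feat:`, `fix:`, `chore:`)

## Definition of Done
- [ ] Tests pass locally and in CI
- [ ] No new linter warnings
- [ ] Error paths handled (fail fast, log with context)
```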
Example invocation
/practices
Key questions
The command guides you through a conversation covering:
- Team size and collaboration model
- Testing philosophy and frameworks
- Code review process and standards
- Git branching strategy and commit conventions
- Error handling and logging patterns
- Security practices and vulnerability prevention
Best practices
- Be specific about testing expectations — "test important logic" is less useful than "unit test all non-trivial functions, E2E for critical user flows"
- Include naming conventions for files, variables, and functions
- Document your definition of done as a checklist the AI can reference
- Mention what NOT to test — this prevents the AI from over-testing trivial code
Common pitfalls
- Writing aspirational practices that your team doesn't actually follow
- Being too vague — "write good tests" gives the AI no guidance
- Forgetting to mention your error handling philosophy (fail fast vs. graceful degradation)
Related commands
research (Market Analyst)
What it does
Analyzes competitors based on your product profile and produces a competitive analysis with feature matrix, strengths/weaknesses, and gap identification. Helps you understand the landscape before committing to features.
What it produces
A competitive analysis saved to gspec/research.md. Includes competitor profiles, feature comparison matrix, strengths/weaknesses analysis, market gaps, and strategic recommendations.
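A sketch of the feature comparison matrix a gspec/research.md might contain. The competitors and capabilities here are invented for illustration:

```markdown
## Feature Comparison Matrix
| Capability           | TaskLoop (planned) | Competitor A | Competitor B |
|----------------------|--------------------|--------------|--------------|
| Drag-and-drop board  | P0                 | Yes          | Yes          |
| Auto-prioritization  | P0                 | No           | Partial      |
| Offline mode         | Deferred           | No           | Yes          |

## Market Gaps
No competitor offers auto-prioritization for teams under five people.
```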
Example invocation
/research
Key questions
The command guides you through a conversation covering:
- Known competitors or similar products
- Specific areas to investigate (features, pricing, market positioning)
- Your product's intended differentiation
Best practices
- Run profile first — research uses your product identity to identify relevant competitors
- Name specific competitors if you know them; the AI will also discover others
- Focus the research on your product's differentiating areas, not every possible feature
- Use research findings to inform feature prioritization
Common pitfalls
- Running research without a profile — the AI doesn't know what market you're in
- Taking the competitive analysis as exhaustive truth — verify key claims
- Feature-stuffing your roadmap based on what competitors have rather than what your audience needs
Related commands
feature (Product Manager)
What it does
Writes a product requirements document (PRD) for an individual feature. Produces prioritized capabilities (P0/P1/P2) with testable acceptance criteria, scope boundaries, and dependency mapping.
What it produces
A feature PRD saved to gspec/features/<feature-name>.md. Includes overview, user stories, scope (in/out/deferred), prioritized capabilities with acceptance criteria, dependencies, assumptions and risks, and success metrics.
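A sketch of a gspec/features/task-board.md PRD, showing prioritized capabilities with checkboxes and acceptance criteria (AC). The scope and capabilities are hypothetical:

```markdown
# Feature: Task Board

## Scope
- In: columns, drag-and-drop, card editing
- Out: real-time multiplayer editing
- Deferred: keyboard-only reordering

## Capabilities
### P0
- [ ] User can create, rename, and delete columns
  - AC: changes persist across page reload
- [ ] User can drag a card between columns
  - AC: new card order is saved within 500 ms
### P1
- [ ] User can filter cards by label
```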
Example invocation
/feature "Task board with drag-and-drop columns"
Key questions
The command guides you through a conversation covering:
- Feature description and user need it addresses
- Target users and their workflows
- Scope boundaries — what's in and what's explicitly out
- Edge cases and error scenarios
- Dependencies on other features or systems
Best practices
- Describe the user need, not the solution — let the Product Manager role shape the requirements
- Be explicit about scope boundaries to prevent feature creep
- P0 capabilities should be the minimum for the feature to be useful
- Keep individual features focused — use epic to break down large initiatives
Common pitfalls
- Describing implementation details instead of user needs in the feature description
- Making everything P0 — if everything is critical, nothing is prioritized
- Writing features that are too large — if it has more than 10-12 capabilities, consider using epic instead
Related commands
epic (Product Manager)
What it does
Breaks down a large initiative into multiple focused feature PRDs with dependency mapping. Use this when a body of work is too large for a single feature PRD.
What it produces
Multiple feature PRDs saved to gspec/features/, plus an epic overview document. Each generated feature PRD follows the same structure as a standalone feature command output.
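A sketch of what the epic overview document might look like for the authentication example below; the feature names and dependencies are illustrative:

```markdown
# Epic: User Authentication

## Features
1. email-password-auth (no dependencies)
2. oauth-providers (depends on email-password-auth: shared session layer)
3. role-based-access (depends on both)

## Implementation Order
email-password-auth -> oauth-providers -> role-based-access
```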
Example invocation
/epic "Complete user authentication system with OAuth, email/password, and role-based access control"
Key questions
The command guides you through a conversation covering:
- The overall initiative and its goals
- How the initiative should be decomposed into features
- Dependencies between the resulting features
- Priority ordering for implementation
Best practices
- Use epic when your initiative naturally decomposes into 3+ distinct features
- Provide context about the overall goal, not just a list of features you want
- Review the generated dependency map before implementing — build dependencies first
- Each generated feature PRD can be independently implemented with /implement
Common pitfalls
- Using epic for something that's really just one feature — you'll get artificially split PRDs
- Not reviewing the dependency mapping — implementing features out of order causes rework
- Trying to specify everything at epic level instead of letting each feature PRD handle its own details
Related commands
architect (Technical Architect)
What it does
Designs the technical architecture — project structure, data model, API design, component architecture, and environment configuration. Reads your stack, practices, and feature PRDs to produce a comprehensive blueprint.
What it produces
A technical architecture document saved to gspec/architecture.md. Includes project structure with directory layout, data model, API design, component architecture, environment configuration, CI/CD setup, and a technical gap analysis.
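A sketch of a gspec/architecture.md for the hypothetical TaskLoop stack above; the directory layout, data model, and endpoints are illustrative assumptions:

```markdown
# Architecture

## Project Structure
src/
  api/        # Fastify route handlers
  db/         # Prisma schema and migrations
  components/ # React UI components
  lib/        # Shared utilities

## Data Model
Board 1..* Column 1..* Card

## API Design
REST over JSON: GET /boards/:id, POST /boards/:id/columns, PATCH /cards/:id
```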
Example invocation
/architect
Key questions
The command guides you through a conversation covering:
- Architectural patterns and preferences
- Data model requirements
- API style and endpoint structure
- Component organization and relationships
- Deployment and environment concerns
Best practices
- Run after defining features — the architect needs requirements to design against
- Run after stack — the architect needs technology constraints
- Review the gap analysis section carefully — it identifies contradictions between specs
- The architecture document is the most influential spec for implement — invest time reviewing it
Common pitfalls
- Running architect before feature PRDs exist — produces a generic architecture
- Skipping the gap analysis review — contradictions between specs cause confused implementation
- Over-constraining the architecture for a simple project — not every project needs a detailed data model
Related commands
- stack — Architect reads stack for technology constraints and patterns
- feature — Architect reads feature PRDs to design the technical structure
- analyze — Run analyze after architect to catch cross-spec contradictions
- implement — Implement reads architecture for project structure, data model, and APIs
analyze (Quality Analyst)
What it does
Cross-references all existing gspec documents and identifies contradictions, gaps, and inconsistencies between them. Run this before implement to ensure your specs tell a coherent story.
What it produces
An analysis report highlighting contradictions between specs, missing information, and recommended resolutions. Updates are applied directly to the relevant spec files.
Example invocation
/analyze
Key questions
None. This command is non-interactive: it reads all specs and produces findings automatically.
Best practices
- Run after architect and before implement for the most comprehensive analysis
- Review findings and accept or reject each recommendation
- Re-run after making significant changes to any spec
- Especially useful when multiple people have contributed to different specs
Common pitfalls
- Running analyze with only one or two specs — it needs multiple specs to find cross-references
- Blindly accepting all suggestions without reviewing context
- Skipping analyze when specs were written at different times — drift is common
Related commands
implement (Senior Developer)
What it does
Reads all existing gspec documents and implements your software with full project context. Synthesizes profile, style, stack, practices, architecture, and feature PRDs to produce code that matches your specifications.
What it produces
Working code committed to your repository. Updates feature PRD capability checkboxes from [ ] to [x] as each capability is implemented. Produces code that follows your stack, practices, and style specifications.
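For example, after an implement run, the capability list in a feature PRD (here a hypothetical task-board PRD) might change from:

```markdown
### P0
- [ ] User can create, rename, and delete columns
- [ ] User can drag a card between columns
```

to:

```markdown
### P0
- [x] User can create, rename, and delete columns
- [ ] User can drag a card between columns
```

leaving unchecked boxes as the remaining work for the next session.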
Example invocation
/implement
Key questions
The command guides you through a conversation covering:
- Which feature(s) to implement (if multiple exist)
- Any additional context or constraints for this implementation session
Best practices
- At minimum, have the four Define specs (profile, style, stack, practices) before running implement
- Feature PRDs give the best results — implement works from prioritized capabilities, not vague descriptions
- Run implement incrementally — one feature at a time, reviewing output between runs
- Check the capability checkboxes in your feature PRD to track progress
- If output quality is low, the issue is usually in the specs — improve your specs, not your prompts
Common pitfalls
- Running implement with no specs — produces generic code with no project context
- Implementing everything at once instead of one feature at a time
- Not reviewing the generated code — implement is a tool, not an autopilot
- Blaming implement for poor output when the real issue is vague or contradictory specs
Related commands
- profile — Implement reads profile for product context and audience
- style — Implement reads style for all visual and UI decisions
- stack — Implement reads stack for technology choices and patterns
- practices — Implement reads practices for code quality and testing standards
- feature — Implement builds against feature PRD capabilities
- architect — Implement reads architecture for project structure and data model