-The AI Context Convention is a standardized method for embedding rich contextual information within codebases to enhance AI-assisted development. This specification outlines a flexible, language-agnostic approach to providing both structured and unstructured context at various levels of a project.
+The AI Context Convention is a standardized method for embedding rich contextual information within codebases to enhance AI-assisted development. This specification outlines a flexible, language-agnostic approach to providing both structured and unstructured context at various levels of a project, catering to the needs of different team roles.

## 2. Key Principles
@@ -55,109 +55,251 @@ project_root/
### 4.1 Markdown Format (Default)

-Markdown files (.context.md) are the default and recommended format. They can include an optional YAML front matter for structured data, followed by free-form Markdown content.
+Markdown files (.context.md) are the default and recommended format. They can include an optional YAML front matter for structured data, followed by free-form Markdown content. The structured data should now include role-specific sections.

Example:

```markdown
---
-description: Core application logic
+project-name: MyAwesomeProject
+version: 1.0.0
+description: A revolutionary web application
+main-technologies:
+  - Node.js
+  - React
+  - MongoDB
conventions:
-  - Use camelCase for variable names
+  - Use consistent naming conventions within each file type
  - Each function should have a single responsibility
-aiPrompts:
+ai-prompts:
  - Focus on performance optimizations
  - Suggest ways to improve error handling
-fileContexts:
-  auth.js:
-    description: Authentication module
-    aiPrompts:
-      - Review security measures
-      - Suggest improvements for password hashing
-  data.js:
-    description: Data processing module
-    conventions:
-      - Use async/await for all database operations
+architecture:
+  style: Microservices
+  main-components:
+    - Auth Service
+    - User Service
+    - Data Processing Service
+  data-flow:
+    - Client -> API Gateway -> Services -> Database
+development:
+  setup-steps:
+    - Install Node.js v14+
+    - Run `npm install` in each service directory
+    - Set up MongoDB instance
+  build-command: npm run build
+  test-command: npm test
+business-requirements:
+  key-features:
+    - User authentication
+    - Real-time data processing
+    - Mobile-responsive UI
+  target-audience: Small to medium-sized businesses
+  success-metrics:
+    - User adoption rate
+    - System response time
+    - Data processing accuracy
+quality-assurance:
+  testing-frameworks:
+    - Jest
+    - Cypress
+  coverage-threshold: 80%
+  performance-benchmarks:
+    - API response time < 200ms
+    - Database query time < 100ms
+deployment:
+  platform: AWS
+  cicd-pipeline: GitHub Actions
+  staging-environment: dev.myawesomeproject.com
+  production-environment: myawesomeproject.com
---

-# Core Application Logic
+# MyAwesomeProject

-This directory contains the core logic for our application, including user authentication and data processing.
+This document provides comprehensive context for MyAwesomeProject, a revolutionary web application designed to streamline business processes.

-## Authentication Module (auth.js)
+## Architecture Overview

-The authentication module handles user login, registration, and password reset functionality. It uses bcrypt for password hashing and JWT for session management.
+MyAwesomeProject follows a microservices architecture, consisting of the following main components:

-Key considerations:
-- OWASP security best practices
-- GDPR compliance for data handling
+1. Auth Service: Handles user authentication and authorization.
+2. User Service: Manages user profiles and preferences.
+3. Data Processing Service: Processes and analyzes business data in real-time.

-## Data Processing Module (data.js)
+The system uses an API Gateway to route requests to appropriate services, ensuring scalability and maintainability.

-The data processing module is responsible for all database interactions and data transformations. It uses an ORM for database operations and implements caching for improved performance.
+## Development Guidelines
+
+- Follow the conventions listed in the front matter.
+- Use feature branches and pull requests for all changes.
+- Write unit tests for all new features and bug fixes.
+- Document all public APIs using JSDoc comments.
+
+## Business Context
+
+The primary goal of MyAwesomeProject is to provide small to medium-sized businesses with a powerful tool for real-time data analysis and visualization. Key features include:
+
+- Secure user authentication
+- Real-time data processing with customizable dashboards
+- Mobile-responsive design for on-the-go access
+
+Success will be measured by user adoption rates, system performance metrics, and data processing accuracy.
+
+## Quality Assurance
+
+Our QA process ensures high-quality, reliable software through:
+
+- Comprehensive unit and integration testing using Jest
+- End-to-end testing with Cypress
+- Continuous integration and deployment via GitHub Actions
+- Regular performance testing and optimization
+
+## Deployment and Operations
+
+MyAwesomeProject is deployed on AWS using a robust CI/CD pipeline:
+
+1. Developers push code to GitHub
+2. GitHub Actions run tests and build the application
+3. Successful builds are deployed to the staging environment
+4. After approval, changes are promoted to production
+
+Monitoring and logging are handled through AWS CloudWatch and the ELK stack.

-Performance considerations:
-- Optimize database queries
-- Implement efficient data structures for in-memory operations
```
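As a rough sketch of how a consuming tool might separate the optional YAML front matter from the Markdown body of a `.context.md` file (the `split_front_matter` helper below is illustrative, not part of the spec):

```python
def split_front_matter(text):
    """Split a .context.md document into (front_matter, body).

    front_matter is the raw text between the opening and closing '---'
    delimiters; body is the remaining Markdown. If the document has no
    front matter, front_matter is None and body is the whole text.
    """
    lines = text.splitlines()
    if not lines or lines[0].strip() != "---":
        return None, text
    for i, line in enumerate(lines[1:], start=1):
        if line.strip() == "---":
            return "\n".join(lines[1:i]), "\n".join(lines[i + 1:])
    # Unterminated front matter: treat the whole file as plain Markdown.
    return None, text


doc = """---
project-name: MyAwesomeProject
version: 1.0.0
---

# MyAwesomeProject
"""
meta, body = split_front_matter(doc)
print(meta)  # project-name: MyAwesomeProject\nversion: 1.0.0
```

The raw front-matter string would then be handed to a YAML parser; the split itself needs no YAML dependency.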
### 4.2 YAML Format

-YAML format (.context.yaml or .context.yml) can be used as an alternative to Markdown for purely structured data.
+YAML format (.context.yaml or .context.yml) should now include the expanded role-specific sections and use kebab-case for key names.

Example:

```yaml
-description: Core application logic
+project-name: MyAwesomeProject
+version: 1.0.0
+description: A revolutionary web application
+main-technologies:
+  - Node.js
+  - React
+  - MongoDB
conventions:
-  - Use camelCase for variable names
+  - Use consistent naming conventions within each file type
  - Each function should have a single responsibility
-aiPrompts:
+ai-prompts:
  - Focus on performance optimizations
  - Suggest ways to improve error handling
-fileContexts:
-  auth.js:
-    description: Authentication module
-    aiPrompts:
-      - Review security measures
-      - Suggest improvements for password hashing
-  data.js:
-    description: Data processing module
-    conventions:
-      - Use async/await for all database operations
+architecture:
+  style: Microservices
+  main-components:
+    - Auth Service
+    - User Service
+    - Data Processing Service
+  data-flow:
+    - Client -> API Gateway -> Services -> Database
+development:
+  setup-steps:
+    - Install Node.js v14+
+    - Run `npm install` in each service directory
+    - Set up MongoDB instance
+  build-command: npm run build
+  test-command: npm test
+business-requirements:
+  key-features:
+    - User authentication
+    - Real-time data processing
+    - Mobile-responsive UI
+  target-audience: Small to medium-sized businesses
+  success-metrics:
+    - User adoption rate
+    - System response time
+    - Data processing accuracy
+quality-assurance:
+  testing-frameworks:
+    - Jest
+    - Cypress
+  coverage-threshold: 80%
+  performance-benchmarks:
+    - API response time < 200ms
+    - Database query time < 100ms
+deployment:
+  platform: AWS
+  cicd-pipeline: GitHub Actions
+  staging-environment: dev.myawesomeproject.com
+  production-environment: myawesomeproject.com
```
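To illustrate how a tool might sanity-check a parsed `.context.yaml` mapping against the expanded sections, here is a minimal sketch; the section list is taken from the example above, and the `missing_sections` helper is hypothetical, not something the spec defines:

```python
# Top-level kebab-case sections shown in the example above.
EXPECTED_SECTIONS = [
    "project-name", "version", "description", "main-technologies",
    "conventions", "ai-prompts", "architecture", "development",
    "business-requirements", "quality-assurance", "deployment",
]


def missing_sections(context):
    """Return the expected top-level keys absent from a parsed context mapping."""
    return [key for key in EXPECTED_SECTIONS if key not in context]


# A YAML parser would yield a plain mapping like this one:
ctx = {
    "project-name": "MyAwesomeProject",
    "version": "1.0.0",
    "description": "A revolutionary web application",
    "main-technologies": ["Node.js", "React", "MongoDB"],
    "conventions": ["Use consistent naming conventions within each file type"],
    "ai-prompts": ["Focus on performance optimizations"],
}
print(missing_sections(ctx))
# ['architecture', 'development', 'business-requirements',
#  'quality-assurance', 'deployment']
```

Whether missing sections should be an error or merely reduce the context available to the AI is left open here; the spec does not mandate a validation policy.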
### 4.3 JSON Format

-JSON format (.context.json) can be used for purely structured data when preferred.
+JSON format (.context.json) should also include the expanded role-specific sections. While JSON keys can technically contain hyphens, camelCase is the prevailing convention for JSON, so the JSON format uses camelCase key names rather than kebab-case.

Example:

```json
{
-  "description": "Core application logic",
+  "projectName": "MyAwesomeProject",
+  "version": "1.0.0",
+  "description": "A revolutionary web application",
+  "mainTechnologies": [
+    "Node.js",
+    "React",
+    "MongoDB"
+  ],
  "conventions": [
-    "Use camelCase for variable names",
+    "Use consistent naming conventions within each file type",
    "Each function should have a single responsibility"
  ],
  "aiPrompts": [
    "Focus on performance optimizations",
    "Suggest ways to improve error handling"
  ],
-  "fileContexts": {
-    "auth.js": {
-      "description": "Authentication module",
-      "aiPrompts": [
-        "Review security measures",
-        "Suggest improvements for password hashing"
-      ]
-    },
-    "data.js": {
-      "description": "Data processing module",
-      "conventions": [
-        "Use async/await for all database operations"
-      ]
-    }
+  "architecture": {
+    "style": "Microservices",
+    "mainComponents": [
+      "Auth Service",
+      "User Service",
+      "Data Processing Service"
+    ],
+    "dataFlow": [
+      "Client -> API Gateway -> Services -> Database"
+    ]
+  },
+  "development": {
+    "setupSteps": [
+      "Install Node.js v14+",
+      "Run `npm install` in each service directory",
+      "Set up MongoDB instance"
+    ],
+    "buildCommand": "npm run build",
+    "testCommand": "npm test"
+  },
+  "businessRequirements": {
+    "keyFeatures": [
+      "User authentication",
+      "Real-time data processing",
+      "Mobile-responsive UI"
+    ],
+    "targetAudience": "Small to medium-sized businesses",
+    "successMetrics": [
+      "User adoption rate",
+      "System response time",
+      "Data processing accuracy"
+    ]
+  },
+  "qualityAssurance": {
+    "testingFrameworks": [
+      "Jest",
+      "Cypress"
+    ],
+    "coverageThreshold": "80%",
+    "performanceBenchmarks": [
+      "API response time < 200ms",
+      "Database query time < 100ms"
+    ]
+  },
+  "deployment": {
+    "platform": "AWS",
+    "cicdPipeline": "GitHub Actions",
+    "stagingEnvironment": "dev.myawesomeproject.com",
+    "productionEnvironment": "myawesomeproject.com"
  }
}
```
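The kebab-case (YAML) and camelCase (JSON) key conventions map mechanically onto each other, so a tool can normalize between the two formats. A sketch of that conversion, assuming a simple lowercase-with-hyphens input (the function names are illustrative, not part of the spec):

```python
def kebab_to_camel(key):
    """Convert a kebab-case key such as 'main-technologies' to 'mainTechnologies'."""
    head, *rest = key.split("-")
    return head + "".join(part.capitalize() for part in rest)


def convert_keys(value):
    """Recursively rename kebab-case keys to camelCase in nested dicts and lists."""
    if isinstance(value, dict):
        return {kebab_to_camel(k): convert_keys(v) for k, v in value.items()}
    if isinstance(value, list):
        return [convert_keys(v) for v in value]
    return value


yaml_style = {
    "main-technologies": ["Node.js"],
    "business-requirements": {"key-features": ["User authentication"]},
}
print(convert_keys(yaml_style))
# {'mainTechnologies': ['Node.js'],
#  'businessRequirements': {'keyFeatures': ['User authentication']}}
```

Keys without hyphens (`version`, `deployment`) pass through unchanged, so the same walk works on any of the examples above.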
@@ -314,7 +456,7 @@ documentation:
## 9. Conclusion

-The AI Context Convention provides a flexible, standardized approach to enriching codebases with contextual information for AI models. By adopting this convention, development teams can enhance AI-assisted workflows, improving code quality and development efficiency across projects of any scale or complexity. The addition of the `.contextdocs` file further enriches the available context by allowing the incorporation of external documentation, ensuring that AI models have access to comprehensive information about the project and its dependencies.
+The AI Context Convention provides a flexible, standardized approach to enriching codebases with contextual information for AI models. By adopting this convention and including role-specific information, development teams can enhance AI-assisted workflows, improving code quality and development efficiency across projects of any scale or complexity. The addition of role-specific guidelines and consistent naming conventions ensures that AI models have access to comprehensive, relevant, and well-structured information tailored to different aspects of the software development lifecycle.