clarify note on platforms
This commit is contained in:
parent
8a5c4869bd
commit
b0d047425f
@@ -37,6 +37,8 @@ uv run train.py
If the above commands all run successfully, your setup is working and you can go into autonomous research mode.
**Platform support**. This code currently requires a single NVIDIA GPU. In principle it is quite possible to support CPU, MPS, and other platforms, but doing so would also bloat the code. I'm not 100% sure I want to take this on personally right now; the code is just a demonstration and I don't know how much I'll support it going forward. People can reference (or have their agents reference) the full/parent nanochat repository, which has wider platform support and shows the various solutions (e.g. a Flash Attention 3 kernel fallback implementation, generic device support, autodetection, etc.). Feel free to create forks or discussions for other platforms, and I'm happy to link to them here in the README, e.g. in a new notable forks section.
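To illustrate the device autodetection mentioned above, here is a minimal sketch of the fallback order (CUDA, then MPS, then CPU). This is illustrative only; the function name and signature are hypothetical and not the parent repo's actual API.

```python
# Hypothetical sketch of device autodetection: prefer CUDA, then Apple
# MPS, then fall back to CPU. The availability flags are passed in so the
# logic stands alone; in real code they would come from torch, e.g.
#   pick_device(torch.cuda.is_available(), torch.backends.mps.is_available())
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"
    return "cpu"
```

The key design point is the preference order: an NVIDIA GPU is used when present, and other backends are only fallbacks.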
## Running the agent
Simply spin up Claude/Codex (or whatever agent you prefer) in this repo (and disable all permission prompts), then you can prompt it with something like: