diff --git a/doc/README.md b/doc/README.md
index b8b3742..28a5851 100644
--- a/doc/README.md
+++ b/doc/README.md
@@ -16,6 +16,7 @@ This section contains links to various documentation sources and is a helpful in
|Inf2 Instance Details |Helpful overview links for the Inferentia2 Instance and associated accelerators |- [AWS Landing Page](https://aws.amazon.com/ai/machine-learning/inferentia/)<br>- [Instance Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/inf2-arch.html#aws-inf2-arch)<br>- [Chip Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/inferentia2.html#inferentia2-arch)<br>- [Core Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/neuron-core-v2.html#neuroncores-v2-arch) |
|Trn1 Instance Details |Similar overview links for Trn1 instances and accelerators |- [AWS Landing Page](https://aws.amazon.com/ai/machine-learning/trainium/)<br>- [Instance Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/trn1-arch.html#aws-trn1-arch)<br>- [Chip Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/trainium.html#trainium-arch)<br>- [Core Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/neuron-core-v2.html#neuroncores-v2-arch) |
|Trn2 Instance Details |Similar overview links for Trn2 instances and accelerators |- [YouTube Launch Video](https://www.youtube.com/watch?v=Bteba8KLeGc)<br>- [Instance Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/trn2-arch.html#aws-trn2-arch)<br>- [Chip Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/trainium2.html#trainium2-arch)<br>- [Core Details](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/neuron-core-v3.html#neuroncores-v3-arch) |
+|Instance Service Quotas |Understand what service quotas are, how they apply to Inferentia and Trainium instances and endpoints, and see an example of quotas appropriate for a POC. |[Inferentia and Trainium Service Quotas](https://repost.aws/articles/ARgmEMvbR6Re200FQs8rTduA/inferentia-and-trainium-service-quotas) |
|Software Overview - General |Overview video of the Trainium software stack |[Video](https://www.youtube.com/watch?v=vaqj8XQfqwM&t=806s) |
|Software Overview - Framework |Application frameworks for developing on Neuron: Torch-NeuronX for small-model inference and training, NxD for distributed modeling primitives, NxD-I, a higher-level abstraction library for inference, and NxD-T, a corresponding abstraction for training. |- Torch-NeuronX ([Training](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/frameworks/torch/torch-neuronx/programming-guide/training/pytorch-neuron-programming-guide.html#pytorch-neuronx-programming-guide), [Inference](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/frameworks/torch/torch-neuronx/programming-guide/inference/trace-vs-xla-lazytensor.html))<br>- [NxD](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/neuronx-distributed/developer-guide.html)<br>- [NxD-T](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/nxd-training/overview.html#nxd-training-overview)<br>- [NxD-I](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/nxd-inference/nxdi-overview.html#nxdi-overview) |
|Software Overview - ML Libraries |ML libraries which offer another interface for deploying to Trainium/Inferentia. Optimum Neuron provides an interface between Hugging Face Transformers and AWS accelerators. AXLearn is a training library built on top of JAX and XLA. |- [Optimum Neuron](https://huggingface.co/docs/optimum-neuron/index)<br>- [AXLearn](https://github.com/apple/axlearn) |
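
For orientation, the Torch-NeuronX inference entry in the table above centers on trace-style, ahead-of-time compilation. The snippet below is a minimal sketch of that workflow, not the authoritative flow from the linked guides; it assumes an Inf2/Trn1 instance with `torch` and `torch-neuronx` installed, and the toy model is purely illustrative.

```python
# Minimal sketch: compile a small PyTorch model for a NeuronCore and run it.
# Assumes torch and torch-neuronx are installed on an Inf2/Trn1 instance.
import torch
import torch_neuronx

# Toy model standing in for a real network; any traceable nn.Module follows the same pattern.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 10),
).eval()

example_input = torch.rand(1, 128)                         # example input fixes the compiled shapes
neuron_model = torch_neuronx.trace(model, example_input)   # ahead-of-time compile for Neuron

torch.jit.save(neuron_model, "model_neuron.pt")            # persist the compiled artifact
restored = torch.jit.load("model_neuron.pt")               # reload later and run on a NeuronCore
print(restored(example_input).shape)
```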