### Create and activate a conda environment with requirements installed
scNiche requires Python 3.9 or higher. If you have already installed a lower version of Python, consider installing Anaconda, and then you can create a new environment.
The versions of PyTorch and DGL should match the CUDA version of your machine. You can find the appropriate versions on the [PyTorch](https://pytorch.org/get-started/locally/) and [DGL](https://www.dgl.ai/) websites.
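
For example, a minimal setup might look like this (the environment name and the CUDA 11.8 builds below are placeholders; substitute the versions that match your machine):

```
conda create -n scniche python=3.9
conda activate scniche

# install PyTorch and DGL builds that match your CUDA version (CUDA 11.8 shown as an example)
pip install torch --index-url https://download.pytorch.org/whl/cu118
pip install dgl -f https://data.dgl.ai/wheels/cu118/repo.html
```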
### Install other requirements
```
cd scNiche-main
pip install -r requirements.txt
```
### Install scNiche
```
python setup.py build
python setup.py install
```
## Tutorials (identify cell niches)
#### - Spatial proteomics data or single-cell spatial transcriptomics data
By default, scNiche requires single-cell spatial omics data (stored in `.h5ad` format) as input, where the cell population label of each cell must be provided.
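
For instance, a minimal input check with `scanpy` might look like this (the file name and the `cell_type` column are assumptions; use whatever key your dataset stores its population labels under):

```
import scanpy as sc

# load the single-cell spatial omics data (an AnnData object stored as .h5ad)
adata = sc.read_h5ad("data.h5ad")

# scNiche expects a cell population label per cell; "cell_type" is an example key
print(adata.obs["cell_type"].value_counts())

# spatial coordinates are conventionally stored in adata.obsm["spatial"]
print(adata.obsm["spatial"][:5])
```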
Here are examples of scNiche on simulated and biological datasets:
* [Demonstration of scNiche on the simulated data](tutorial/tutorial_simulated.ipynb)
* [Demonstration of scNiche on the mouse V1 neocortex STARmap data](tutorial/tutorial_STARmap.ipynb)

scNiche also provides a subgraph-based batch training strategy to scale to large datasets and multiple slices:

1. Batch training strategy of scNiche for a single slice:
* [Demonstration of scNiche on the mouse spleen CODEX data](tutorial/tutorial_spleen.ipynb) (over 80,000 cells per slice)

2. Batch training strategy of scNiche for multiple slices:
* [Demonstration of scNiche on the human upper tract urothelial carcinoma (UTUC) IMC data](tutorial/tutorial_utuc.ipynb) (containing 115,060 cells from 16 slices)
* [Demonstration of scNiche on the mouse frontal cortex and striatum MERFISH data](tutorial/tutorial_MERFISH.ipynb) (containing 376,107 cells from 31 slices)
#### - Low-resolution spatial transcriptomics data
Here we take 4 slices from the same donor in the [human DLPFC 10X Visium data](http://spatial.libd.org/spatialLIBD/) as an example.

In contrast to spatial proteomics data, which usually contain only a few dozen proteins, these spatial transcriptomics data can often measure tens of thousands of genes,
with potential batch effects commonly present across tissue slices from different samples.
Therefore, dimensionality reduction and batch effect removal need to be performed on the molecular profiles of the cells and their neighborhoods before running scNiche.
We used [scVI](https://github.com/scverse/scvi-tools) by default; however, simple PCA dimensionality reduction or other deep learning-based integration methods such as [scArches](https://github.com/theislab/scarches) are also applicable.
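
As a rough sketch of this preprocessing step with scvi-tools (assuming the slice/sample ID is stored in `adata.obs["batch"]`; the key names and `n_latent` are illustrative):

```
import scvi

# scVI expects raw counts in adata.X (or pass layer="counts" to setup_anndata)
scvi.model.SCVI.setup_anndata(adata, batch_key="batch")

# train scVI and store the batch-corrected low-dimensional representation
model = scvi.model.SCVI(adata, n_latent=30)
model.train()
adata.obsm["X_scVI"] = model.get_latent_representation()
```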
Furthermore, cell type labels are usually unavailable for these spatial transcriptomics data. As alternatives,
we can:
1. Use the `deconvolution results of spots` as a substitute view to replace the `cellular compositions of neighborhoods`.
We used the human middle temporal gyrus (MTG) scRNA-seq data by [Hodge et al.](https://doi.org/10.1038/s41586-019-1506-7) as the single-cell reference and deconvoluted the spots using [Cell2location](https://github.com/BayraktarLab/cell2location) (see the sketch after this list):
* [Demonstration of scNiche on Slice 151673 (with deconvolution results)](tutorial/tutorial_dlpfc151673.ipynb)
2. Only use the molecular profiles of cells and neighborhoods as input:
* [Demonstration of scNiche on Slice 151673 (without deconvolution results)](tutorial/tutorial_dlpfc151673-2view.ipynb)
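
For reference, a heavily condensed sketch of the Cell2location workflow used in option 1 (see the Cell2location tutorials for the full procedure; `adata_ref`, `adata_vis`, the `cell_type` key, and the hyperparameters are placeholders):

```
from cell2location.models import RegressionModel, Cell2location

# step 1: estimate reference cell-type signatures from the scRNA-seq reference
RegressionModel.setup_anndata(adata_ref, labels_key="cell_type")
mod_ref = RegressionModel(adata_ref)
mod_ref.train()
adata_ref = mod_ref.export_posterior(adata_ref)

# estimated per-cell-type expression signatures (column selection/renaming
# per the Cell2location tutorial is elided here)
inf_aver = adata_ref.varm["means_per_cluster_mu_fg"]

# step 2: map the signatures onto the Visium slides to estimate spot-level abundances
Cell2location.setup_anndata(adata_vis)
mod = Cell2location(adata_vis, cell_state_df=inf_aver, N_cells_per_location=30)
mod.train()
adata_vis = mod.export_posterior(adata_vis)

# per-spot cell abundance estimates (the substitute view) land in obsm
proportions = adata_vis.obsm["q05_cell_abundance_w_sf"]
```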
Multi-slice analysis of 4 slices based on the batch training strategy of scNiche:
* [Demonstration of scNiche on 4 slices from the same donor (with deconvolution results)](tutorial/tutorial_DLPFC.ipynb)
#### - Spatial multi-omics data
The strategy of scNiche for modeling features from different views of the cell offers further avenues for extension,
such as application to spatial multi-omics data. Here we ran scNiche on a postnatal day (P)22 mouse brain coronal section
dataset generated by [Zhang et al.](https://doi.org/10.1038/s41586-023-05795-1), which includes RNA-seq and CUT&Tag (acetylated histone H3 Lys27 (H3K27ac) histone modification) modalities.
The dataset can be downloaded [here](https://zenodo.org/records/10362607).

* [Demonstration of scNiche on the mouse brain spatial CUT&Tag–RNA-seq data](tutorial/tutorial_multi-omics.ipynb)
## Tutorials (characterize cell niches)
scNiche also offers a downstream analytical framework for characterizing cell niches more comprehensively.
Here are examples of scNiche on two biological datasets:
* [Demonstration of scNiche on the mouse liver Seq-Scope data](tutorial/tutorial_liver.ipynb)
## Acknowledgements
The scNiche model was developed based on the [multi-view clustering framework (CMGEC)](https://github.com/wangemm/CMGEC-TMM-2021). We thank the authors for releasing their code.
## About
scNiche is developed by Jingyang Qian. Should you have any questions, please contact Jingyang Qian at [email protected].